US20020154132A1 - Texture mapping 3d graphic objects - Google Patents

Texture mapping 3d graphic objects

Info

Publication number
US20020154132A1
US20020154132A1 (application US08/903,440)
Authority
US
United States
Prior art keywords
texture
mapping
graphic
user
graphic object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US08/903,440
Inventor
Alain M. Dumesny
Christopher L. Fouts
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CA Inc
Platinum Technology International Inc
Original Assignee
Computer Associates Think Inc
Platinum Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Computer Associates Think Inc, Platinum Technology Inc filed Critical Computer Associates Think Inc
Priority to US08/903,440
Assigned to SILICON GRAPHICS, INC. reassignment SILICON GRAPHICS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DUMESNY, ALAIN M., FOUTS, CHRISTOPHER L.
Assigned to SILICON GRAPHICS, INC. reassignment SILICON GRAPHICS, INC. ASSIGNMENT OF ASSIGNOR'S INTEREST RE-RECORD TO CORRECT THE RECORDATION DATE OF 3-10-98 TO 3-11-98 PREVIOUSLY RECORDED AT REEL 9032, FRAME 344 Assignors: DUMESNY, ALAIN M., FOUTS, CHRISTOPHER L.
Assigned to PLATINUM TECHNOLOGY IP, INC. reassignment PLATINUM TECHNOLOGY IP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PLATINUM TECHNOLOGY, INC.
Assigned to PLATINUM TECHNOLOGY, INC. reassignment PLATINUM TECHNOLOGY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SILICON GRAPHICS, INC.
Assigned to COMPUTER ASSOCIATES THINK, INC. reassignment COMPUTER ASSOCIATES THINK, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PLATINUM TECHNOLOGY IP, INC.
Publication of US20020154132A1
Priority to US11/227,779 (US7148899B2)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/04 - Texture mapping

Definitions

  • This invention relates to mapping textures onto 3D graphic objects.
  • the computer system illustrated in FIG. 1--which includes mouse 15, keyboard 16, CPU 17 and CRT 18--represents a hardware setup for running software that allows a user to model and/or render 3D graphic objects. Modeling and rendering of 3D graphic objects are used, for example, to generate 3D virtual scenes on the display screen of a computer system.
  • the surface topology of a 3D object can be modeled as a set of polygons of appropriate sizes and shapes joined at their edges.
  • the set of polygons defining the 3D object, referred to as the “model” for the 3D object, is represented by a collection of coordinates residing in a logical space (referred to as the “object space”) defined by a set of three axes, x, y and z.
  • the 3D object is to be displayed in wire-frame form, only the edge lines that form the polygons in the model are rendered, thus generating a skeleton-like image of the object.
  • the model of an X-29 aircraft shown in FIG. 2A consists of several hundred unfilled polygons joined together at their edges to form a wire-frame representation of the aircraft.
  • FIGS. 3A-3H show four different examples of before and after views of various surface textures being applied to four different 3D objects--a cube (FIGS. 3A-3B), a cylinder (FIGS. 3C-3D), a sphere (FIGS. 3E-3F) and a cone (FIGS. 3G-3H).
  • Texture mapping, in which an image or “texture map”--a 2D array of pixel information, such as the map of Earth shown in FIG. 4--is overlaid on a 3D object's surface, typically in a repeating manner, roughly similar to laying tiles on a floor.
  • the texture can be stretched to cover the surface under consideration.
  • a texture map resides in its own logical space (referred to as the “texture space”) defined by a pair of axes s and t as shown in FIG. 4.
  • the texture space is independent of an object space in which the surface topology of a 3D object is defined.
  • a texture is mapped onto a 3D object by creating a mapping, or correlation, between coordinates in the object space and coordinates in the texture space.
  • Different mapping functions can be used to apply a texture to a 3D object depending on the object designer's preferences.
  • Many of the various different “primitives” (fundamental surface topologies) each have an associated standard mapping function.
  • the standard texture mapping function for a cube is to apply the entire texture map to each face of the cube.
  • the standard mapping function for a cylinder is to apply the texture map to each end face of the cylinder and to wrap one instance of the entire texture map around the cylinder's circumference, as shown in FIG. 5B.
  • Standard texture mapping functions also exist for a sphere (FIG. 5C) and a cone (FIG. 5D).
  • a standard plane mapping function typically is used to apply a texture to the surface of a 3D object.
  • the 3D object's bounding box (i.e., a logical 3D box that is just big enough to encompass the 3D object)
  • texture (for example, the texture map shown in FIG. 6B)
  • plane mapping is accomplished by projecting the texture through the 3D graphic object, analogous to projecting an image from a slide projector onto a 3D physical object, as shown in FIG. 6C.
  • GUI (graphical user interface)
  • the utility includes a Texture Mapper window 70 having a preview region 72 in which a preview version of the 3D object 71 being texture-mapped is displayed.
  • a user designates a new or different texture for the 3D object by clicking the cursor on one of several different texture choices displayed in a texture palette 73 .
  • a preview version of the object 71 with the selected texture applied is displayed in the preview window 72 to give the user an indication of how the texture mapped object will appear when rendered at a full level of detail.
  • a texture mapping utility may provide other features that allow a user to specify certain aspects of the manner in which a texture is mapped to the 3D object. Such features may allow the user to translate the texture map relative to the 3D object, to select among repeating and stretching type mapping functions, or to specify a texture repetition interval.
  • a texture is applied to a 3D graphic object using a computer system by providing a graphic indication of a mapping between a 3D graphic object and a texture map and modifying the mapping based on a manipulation of the graphic indication. Substantially at the same time as the modification of the mapping occurs, the 3D graphic object can be rendered based on the modified mapping. The modification of the mapping and the rendering of the 3D graphic object can occur in real time.
  • Prior to the modification of the mapping, user input specifying the manipulation of the graphic indication of the mapping may be received through a graphic user interface. This may include presenting a first display region representative of a texture space and at least one other display region representative of an object space.
  • the texture map may be displayed in the first display region and the 3D graphic object may be displayed in the other display region.
  • Provision of the graphic indication of the mapping may include displaying in the first display region a graphic element representative of the 3D graphic object at a location on the texture map corresponding to the mapping.
  • Manipulation of the graphic indication may include performing a direct manipulation operation on the graphic element displayed in the first display region.
  • the direct manipulation operation may include translating the graphic element to a new location on the texture map, scaling the graphic element relative to the texture map, rotating the graphic element relative to the texture map, or a combination thereof.
  • Input may be received from a user specifying the direct manipulation operation to be performed.
  • Visual feedback may be provided as the direct manipulation operation is being performed.
  • if the direct manipulation operation causes the graphic element to extend beyond a boundary of the texture map, an instance of the texture map may be repeated automatically.
  • Modification of the mapping may be accomplished by modifying a mapping between coordinates in an object space and coordinates in a texture space.
  • the modification may involve altering only a portion of the mapping between the 3D graphic object and the texture map.
  • the 3D graphic object can be modelled as a plurality of polygons defined by coordinates in an object space.
  • Modification of the mapping can be based on input received from a user specifying one or more polygons for which the mapping is to be modified.
  • a set of coordinates in the object space and a set of coordinates in the texture space may be selectively shared or not (e.g., based on user input) between adjacent polygons. Where coordinates are shared, modification of the mapping for a polygon also may modify the mapping for an adjacent polygon. Alternatively, modification of the mapping may alter the mapping for a polygon without altering the mapping for an adjacent polygon.
  • a texture is applied to a 3D graphic object using a computer system by providing a visual indication of a correlation between a 3D graphic object and a texture map and modifying the correlation by selectively applying a predetermined mapping function between a texture space and an object space to a portion of a 3D graphic object.
  • Modification of the correlation may involve selectively applying one of several available predetermined mapping functions between a texture space and an object space to different portions of the 3D graphic object.
  • User input may be received (e.g., through a graphical user interface) that specifies one of the predetermined mapping functions which is to be applied to the portion of the 3D graphic object.
  • User input also may be received that specifies different portions of the 3D graphic object to which the specified mapping functions are to be applied.
  • a user interface for applying a texture to a 3D graphic object includes an object region displaying a 3D graphic object formed of polygons, a texture region displaying a texture map, and a marker (e.g., a graphic indication) within the texture region defining a mapping between a polygon of the 3D graphic object and the texture map.
  • the marker may appear visually similar to a corresponding polygon forming the 3D graphic object. Movement of the marker within the texture map may cause a corresponding change in the mapping between the polygon of the 3D graphic object and the texture map.
  • Several markers may be displayed within the texture region each of which defines a mapping between a corresponding polygon of the 3D graphic object and the texture map. In that case, one of the markers can be moved independently of the other markers. Movement of one of the markers selectively may or may not affect the mappings of adjacent markers.
  • texture mapping methods and techniques described here may include one or more of the following.
  • the texture manipulation techniques and methods described here represent a rich and powerful set of tools with which texture mappings can be adjusted or fine tuned according to a user's preferences. As a result, users are able to generate highly realistic textures on 3D objects having arbitrarily complex surface topologies.
  • the texture manipulation tools are presented in an intuitive graphic user interface that allows users to modify texture mappings using direct manipulation techniques. Users are guided in their texture modifications through real time visual feedback. As a result, users can craft texture mappings to their liking through a few simple, fast and easy steps.
  • the texture manipulation tools provide a dramatic increase in the flexibility with which texture mappings can be modified. Users have the option of modifying a texture mapping for any individual polygon in the 3D object or for an arbitrary set of the object's polygons.
  • the texture modification can be accomplished through any of three different types of operations (translating, scaling, rotating), further increasing the flexibility available to users.
  • users are given the option of modifying the texture mapping for an object polygon independently of the neighboring object polygons.
  • FIG. 1 shows a computer system displaying a 3D image.
  • FIGS. 2A and 2B are views of an X-29 aircraft rendered in wire-frame and shaded representations, respectively.
  • FIGS. 3A-3H show before and after views of four different examples of applying a texture to the surface of a 3D graphic object.
  • FIG. 4 is an example of a texture map.
  • FIGS. 5A-5D illustrate the standard texture mapping functions for four different primitives.
  • FIGS. 6A-6C show an example of a texture applied to a 3D object having an arbitrary surface topology.
  • FIG. 7 shows an example of a graphic user interface utility for applying textures to 3D graphic objects.
  • FIGS. 8A-8C show examples of artifacts that can arise when a texture is applied to a 3D object having an arbitrary surface topology.
  • FIGS. 9A-9B and 10 show how object polygons are mapped to a texture in the PEP Texture Applicator window.
  • FIGS. 11A-11B are before and after views of modifying a texture mapping by translating a single texture coordinate.
  • FIGS. 12A-12B are before and after views of modifying a texture mapping by translating multiple texture coordinates.
  • FIGS. 13A-13B are before and after views of modifying a texture mapping by scaling texture coordinates.
  • FIG. 14 illustrates the automatic texture map repetition feature of the PEP Texture Applicator.
  • FIGS. 15A-15B are before and after views of modifying a texture mapping by rotating texture coordinates.
  • FIGS. 16A-16E are successive screen shots showing an example of breaking shared texture coordinates between object polygons.
  • FIGS. 17A-17J are successive screen shots showing an example of modifying a texture mapping to correct artifacts.
  • the Silicon Graphics Cosmo Worlds authoring environment provides a rich body of texture application and manipulation functionality that enables a user of a computer system to apply a texture and fine tune it until the object has the desired appearance.
  • Cosmo Worlds texture manipulation features enable users to make the object appear unflawed and realistic.
  • Cosmo Worlds implements this and other functionality through intuitive visual manipulation tools that allow users to easily and directly manipulate texture mappings applied to 3D graphic objects.
  • these tools enable users to orient, align, position and size a texture map relative to a 3D graphic object and to apply different standard mapping functions to different portions of a single 3D graphic object.
  • Online documentation, incorporated herein by reference, describing the operation and features of Cosmo Worlds may be accessed at the following location on the World Wide Web:
  • FIGS. 8A-8C show examples of two types of artifacts--“smearing” and “smudging”--which can occur when a texture is mapped onto the surface of a 3D object using the standard plane texture mapping function.
  • the multi-colored polka dot texture map 80 of FIG. 8A was mapped onto a 3D object 81 (a chess pawn) having a relatively complex surface topology, with the result of the mapping shown in FIG. 8B.
  • texture mapping artifacts may not be readily apparent when the 3D object is rendered in a frontal view (i.e., looking straight ahead at the largest face of the bounding box) depending on the surface topology of the 3D object under consideration.
  • the texture mapping artifacts are readily apparent. For example, “smearing” of the polka dot texture (stretching the texture to the point of distortion) is evident on a face 82 of the object 81 in FIG. 8B. Similarly, “smudging” of the polka dot texture (compressing the texture to the point of distortion) is evident on a face 83 when the texture is applied to a different object 84 (a multi-fold screen) as shown in FIG. 8C.
  • the texture mapping artifacts illustrated in FIGS. 8B and 8C arise when a finite area of the surface of the object is mapped to a disproportionately sized region in the texture map. Smearing occurs when a designated area of the object is disproportionately large in comparison to the area of the texture to which the object is mapped. In effect, the texture's appearance becomes distorted as shown in FIG. 8B due to the extent to which the texture map must be stretched to cover the associated area of the 3D object.
  • Cosmo Worlds includes a Texture Applicator utility which provides an interactive GUI that enables users to correct artifacts such as smearing and smudging, or otherwise manipulate a texture mapping, by adjusting the relationship between individual points of the 3D object model and locations of texture coordinates on the texture map.
  • User interaction with the Texture Applicator utility is based on the Cosmo Worlds PEP (Point-Edge-Polygon) editing paradigm.
  • With PEP editing, users can designate and manipulate one or more of the points, edges or polygons (or any combination thereof) that make up a 3D object, rather than manipulating the 3D object as a whole.
  • PEP editing is discussed in further detail in the online documentation referenced above, and in U.S. patent application Ser. No. 08/845,857, entitled “MANIPULATING GRAPHIC OBJECTS IN 3D SCENES,” which is incorporated by reference.
  • the Texture Applicator employs a direct, as opposed to an indirect, object manipulation paradigm.
  • with indirect manipulation tools, users do not work on the graphic object itself but rather use a separate graphical abstraction (e.g., a slider bar or a thumbwheel), typically appearing at a fixed location on the display screen, to bring about corresponding changes to the graphic object.
  • Direct manipulation tools allow a user to manipulate a graphic object directly by placing the cursor in the proximity of the object, and dragging the cursor within the scene to effect the desired change to the object.
  • users can modify a texture mapping by directly adjusting positions within the texture space to which associated polygons of the 3D object are mapped.
  • a direct manipulation tool, such as the Texture Applicator, enhances the intuitiveness of a user's interactions with the tool and thus enables the user to accomplish a desired object manipulation more quickly, easily and precisely.
  • the Texture Applicator window 90 includes two regions: an object space region 91 on the right which displays a rendering of the 3D object 81 to which a texture is applied and a texture space region 92 on the left which provides a visual indication of how the texture map 93 is to be applied to the object 81 .
  • by performing appropriate mouse manipulations within these two regions 91 and 92, a user can specify a desired arbitrary mapping into the texture space for one or more of the individual polygons forming the 3D object (referred to as “object polygons”).
  • the user first designates within the object space region 91 the one or more object polygons for which the texture mapping is to be manipulated. If no object polygons are selected, subsequent texture coordinate manipulation operations are performed on all object polygons forming the 3D object.
  • the user has selected for texture mapping manipulation a single object polygon 94 , as indicated by the surrounding highlighting visual feedback.
  • the perimeter of the selected object polygon 94 is rimmed with a white line having a red stripe and white/red circles appear at each of the polygon's apexes.
  • three corresponding texture coordinates forming region 95 are displayed in the texture space region 92 at a location corresponding to the current texture mapping for the selected object polygon 94 .
  • the lines between texture coordinates are intended to provide users with information about the relationships between texture coordinates.
  • the user may select different or additional object polygons in the object space region 91 and their respective mappings into the texture space will be represented as corresponding sets of three or more texture coordinates in the texture space region 92 .
  • the user in this example has selected two additional object polygons 96 and 97 on either side of object polygon 94 and their respective mappings into the texture space are displayed accordingly in texture space region 92 .
  • FIG. 10 which shows the texture mapping for a top end surface of a cylinder 100
  • the user has selected all of the 16 wedge-shaped object polygons 101 that form the cylinder's end surface.
  • the corresponding mappings into the texture space for the selected object polygons are represented by the circular mesh 102 formed from 17 texture coordinates as displayed in the texture space region 92 .
  • FIGS. 11A-11B illustrate an example of modifying a texture mapping using translation, as indicated by the active state of the arrow button 118 in the Texture Applicator window 90 .
  • the user has selected in object space region 91 a single object polygon 110 which corresponds to an entire face of cube 112 .
  • the associated mapping into the texture space for object polygon 110 is indicated in texture space region 92 by four texture coordinates ( 113 , 114 , 115 , 116 ) which form a square region 111 .
  • the user can modify the texture mapping for object polygon 110 by moving one or more of the four texture coordinates ( 113 , 114 , 115 or 116 ) within the texture space region 92 .
  • the user in this example has selected a single texture coordinate 113 .
  • the user modifies the texture mapping by dragging texture coordinate 113 toward the center of the texture map thereby transforming the square region 111 into an irregularly shaped region 117 , as shown in FIG. 11B.
  • the display of the cube 112 in the object space region 91 is updated in real time to reflect the current texture mapping as it is being modified.
  • the user's cursor manipulations have caused a reduced area of the texture map (i.e., the area bounded by region 117 in FIG. 11B) to be mapped to face 110 of cube 112 . Because the area of the texture map bounded by region 117 is now smaller than the area of face 110 , the designated portion of the texture map is stretched to cover face 110 , giving the cube 112 the appearance shown in FIG. 11B.
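  • A rough sketch of the translation operation follows (illustrative Python; the coordinate layout and function name are assumptions, not taken from the patent): dragging selected texture coordinates simply offsets them in (s, t), while unselected coordinates stay put, which deforms the mapped region as in FIG. 11B.
      def translate_coords(tex_coords, selected_indices, ds, dt):
          """Offset only the selected texture coordinates by (ds, dt)."""
          result = list(tex_coords)
          for i in selected_indices:
              s, t = result[i]
              result[i] = (s + ds, t + dt)
          return result

      # Dragging one corner of a square mapping toward the center produces an irregular
      # quadrilateral, so a smaller texture region is stretched over the selected face.
      square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
      print(translate_coords(square, [0], 0.4, 0.4))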
  • FIGS. 12A-12B, which show another example of modifying a texture mapping using translation
  • the user has designated a face 121 of a cube 120 as the object polygon for which texture mapping is to be modified.
  • the corresponding mapping into the texture is represented by four texture coordinates ( 123 , 124 , 125 and 126 ) which form region 122 , and which occupy only a subset of the overall texture map 127 .
  • the user has selected all four of the texture coordinates ( 123 , 124 , 125 and 126 ) that form the region 122 and has dragged them as a group to a new location on the texture map 127 as shown in FIG. 12B.
  • the display of the cube 120 in the object space region 91 is changed in real time to reflect the modified texture mapping, which in this case does not require any stretching or compression of the texture but rather simply reflects a different subset of the texture map 127 .
  • FIGS. 13A-13B illustrate an example of modifying a texture mapping using scaling, as indicated by the active state of the scaling button 130 in the Texture Applicator window 90 .
  • the user has selected a single object polygon 131 which corresponds to an entire face of cube 132 .
  • the associated mapping into the texture space for object polygon 131 is indicated in texture space region 92 by the four texture coordinates forming region 133 .
  • the user then can modify the texture mapping for the selected object polygon by scaling the four texture coordinates forming region 133 as desired (larger or smaller) about a point (the “scale-about” point 134 ) specified by the user.
  • Positioning of the scale-about point is accomplished by clicking on a button 119 (or simply by clicking the cursor on the scale-about point provided it is visible in the current view) and then dragging the scale-about point to the desired location.
  • the user in this example has positioned the scale-about point 134 at the center of region 133 (which happens to coincide with the center of the texture map 135 ).
  • visual feedback is provided to the user in the form of a four-headed arrow 137 (to indicate that the cursor may be moved in any direction) and a box 138 formed of blue-and-white dashed lines which rubber-band to continuously occupy the space between the scale-about point 134 and the current cursor point as the user moves the cursor within texture space region 92 (to give the user a sense of the relative distance over which scaling has occurred).
  • the user in this example has dragged the cursor from the click-down point 136 to a point 139 , and the region 133 that corresponds to the texture mapping accordingly has shrunk to roughly a quarter of its original size.
  • the resultant texture mapping for face 131 showing an effective magnification of the texture relative to its appearance in FIG. 13A, is displayed in object space region 91 .
  • the user could have increased the size of the area bounded by region 133 (i.e., the area of the texture map to which a selected object polygon is mapped) by dragging the cursor away from the designated scale-about point. In that case, a larger portion of the texture map 135 would have been mapped to face 131 as a result.
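  • A minimal sketch of the scaling operation about a scale-about point (illustrative Python; names are assumptions): each selected texture coordinate moves toward or away from the pivot by a common factor, so a factor below one shrinks the mapped region and magnifies the texture on the object.
      def scale_coords(tex_coords, selected_indices, pivot, factor):
          """Scale the selected texture coordinates about the pivot (scale-about) point."""
          ps, pt = pivot
          result = list(tex_coords)
          for i in selected_indices:
              s, t = result[i]
              result[i] = (ps + (s - ps) * factor, pt + (t - pt) * factor)
          return result

      # Halving the side lengths about the center of the texture map leaves a region with
      # roughly a quarter of its original area, as in the example of FIG. 13B.
      square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
      print(scale_coords(square, range(4), (0.5, 0.5), 0.5))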
  • the Texture Applicator utility repeats the texture map automatically as needed. As shown in FIG. 14, for example, after selecting face 141 of cube 140 for texture mapping modification, the user has dragged one of the points (point 143 ) that defines the region 142 to the right beyond the border of the texture map 144 . In response, the texture map 144 is repeated automatically causing another instance 145 of the texture map to appear in the texture space region 92 . In response to user interactions, the texture map will continue to be repeated automatically in any or all directions to accommodate virtually any size of mapping. Similarly, repeated instances of the texture map will disappear as needed in response to a reduction in the size of the mapping.
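  • The automatic repetition can be pictured with the common wrap-around convention for texture coordinates (an assumption; the patent does not specify the renderer's addressing mode): values outside [0, 1] sample repeated instances of the texture, and the editor can display as many tiles as the mapping spans.
      import math

      def wrap(coord):
          """Wrap a texture coordinate into [0, 1), so values past the border repeat the texture."""
          return coord - math.floor(coord)

      def tiles_spanned(tex_coords):
          """Number of texture-map instances the mapping covers along s and t."""
          ss = [s for s, _ in tex_coords]
          ts = [t for _, t in tex_coords]
          return (math.floor(max(ss)) - math.floor(min(ss)) + 1,
                  math.floor(max(ts)) - math.floor(min(ts)) + 1)

      print(wrap(1.3))                                # ~0.3: samples the repeated instance
      print(tiles_spanned([(0.2, 0.2), (1.6, 0.8)]))  # (2, 1): two tiles along s, one along t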
  • FIGS. 15A-15B illustrate an example of modifying a texture mapping using rotation, as indicated by the active state of the rotation button 159 in the Texture Applicator window 90 .
  • the user has selected a single object polygon 151 which corresponds to an entire face of cube 150 .
  • the associated mapping into the texture space for the face 151 is indicated in texture space region 92 as four texture coordinates forming a square region 152 .
  • the user then can modify the texture mapping for the face 151 by rotating selected texture coordinates as desired (clockwise or counterclockwise) about a point (the “rotation point”) specified by the user.
  • Positioning of the rotation point is accomplished in the same manner as positioning of the scale-about point during scaling--namely, by clicking on the button 119 (or simply by clicking the cursor on the rotation point provided it is visible in the current view) and then dragging the rotation point to the desired location.
  • the user in this example has positioned the rotation point 153 at the center of region 152 (which happens to coincide with the center of the texture map 154 ).
  • visual feedback is provided to the user in the form of a partial circle 156 formed of two arced arrows (to indicate that the cursor may be rotated in either direction) and a blue-and-white dashed line 157 which rubber-bands to continuously extend between the rotation point 153 and the current cursor point as the user moves the cursor within texture space region 92 (to give the user a sense of the relative degree to which rotation has occurred).
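  • A minimal sketch of the rotation operation about a rotation point (illustrative Python; names are assumptions): each selected texture coordinate is rotated rigidly about the pivot, which correspondingly rotates the texture's appearance on the selected polygons.
      import math

      def rotate_coords(tex_coords, selected_indices, pivot, angle_degrees):
          """Rotate the selected texture coordinates about the pivot (rotation) point."""
          ps, pt = pivot
          a = math.radians(angle_degrees)
          cos_a, sin_a = math.cos(a), math.sin(a)
          result = list(tex_coords)
          for i in selected_indices:
              ds, dt = result[i][0] - ps, result[i][1] - pt
              result[i] = (ps + ds * cos_a - dt * sin_a, pt + ds * sin_a + dt * cos_a)
          return result

      square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
      print(rotate_coords(square, range(4), (0.5, 0.5), 45.0))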
  • each object polygon in a 3D object shares both object coordinates and texture coordinates with its neighboring polygons.
  • when the texture mapping for one polygon changes, the texture mappings for its neighboring polygons necessarily change as well. Sometimes this behavior is desirable but other times it is not.
  • the Texture Applicator allows users to sever shared texture coordinate relationships between object polygons selectively (referred to as “breaking instancing”), thus enabling each object polygon (or a combination of two or more object polygons) to have a separate texture mapping independent of the texture mappings for other object polygons in the 3D object.
  • FIGS. 16A-16E illustrate an example of breaking instancing between adjacent object polygons so that they no longer share texture coordinates in the texture space.
  • the 3D object is a multi-fold screen 160 formed of 16 object polygons as shown in FIG. 16A. All 16 of the object polygons have been selected and the corresponding mapping into the texture space is represented by the grid 161 (formed of 25 corresponding texture coordinates) displayed in the texture space region 92 .
  • FIG. 16B when the user drags the four texture coordinates that define region 163 (which corresponds to the mapping for object polygon 164 ) to a new location on the texture map 162 , the eight regions surrounding region 163 deform to follow its movement. This is because the four texture coordinates that were moved are shared by the nine different object polygons. As a result, the texture mappings for object polygon 164 and its eight surrounding object polygons are changed in a corresponding manner.
  • the user desires to modify the texture mapping for one or more object polygons without affecting the texture mappings of its neighboring object polygons, the user can break instancing between a designated object polygon and its neighbors by selecting the “break instancing” option under the Edit menu of the Texture Applicator utility.
  • the user has selected object polygon 164 and has broken instancing between object polygon 164 and its neighbors as shown in FIG. 16C. Consequently, the user now can modify the texture mapping for object polygon 164 without affecting the texture mappings of its neighbors by moving the corresponding four texture coordinates forming region 163 to the desired location on the texture map 162 .
  • FIG. 16D the user has moved the four texture coordinates which form region 163 slightly up and to the right thereby modifying the texture mapping for object polygon 164 , but not for any of the other object polygons.
  • the overall image on the screen 161 appears in FIG. 16D as a modified self-portrait of Vincent van Gogh having three eyes.
  • the texture mapping for object polygon 164 changes accordingly as shown in FIG. 16E.
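  • The sharing and breaking of texture coordinates can be sketched with an indexed-mesh representation (an assumption; the patent does not specify a data structure): adjacent polygons share a coordinate by referencing the same index, so moving it deforms every polygon that uses it, and breaking instancing gives the selected polygon private copies.
      def break_instancing(tex_coords, tex_index_lists, polygon):
          """Duplicate the texture coordinates used by `polygon` so it no longer shares them."""
          new_coords = list(tex_coords)
          new_indices = [list(face) for face in tex_index_lists]
          for slot, old_index in enumerate(new_indices[polygon]):
              new_coords.append(tex_coords[old_index])        # private copy for this polygon
              new_indices[polygon][slot] = len(new_coords) - 1
          return new_coords, new_indices

      # Two quads sharing an edge: texture coordinates 1 and 2 are referenced by both faces.
      coords = [(0.0, 0.0), (0.5, 0.0), (0.5, 1.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
      faces = [[0, 1, 2, 3], [1, 4, 5, 2]]
      coords, faces = break_instancing(coords, faces, polygon=1)
      # Face 1 now references its own copies, so translating them leaves face 0 unchanged.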
  • FIGS. 17A-17J illustrate an example of using the various texture manipulation features provided by the Texture Applicator to fine tune a texture mapping according to a user's preferences.
  • FIG. 17A shows the Texture Applicator window for a chess pawn 170 having a polka dot texture map 171 which was applied using the standard plane texture mapping function, and which has resulted in smearing of the texture across several object polygons in the region 172 .
  • the user has a number of options available, including being able to designate a “best fit” mapping function (under the “Mapping” menu choice) for applying the texture map 171 to one or more selected portions of the pawn 170 .
  • the user has not selected any of the object polygons forming the pawn 170 so any texture mapping modification that is performed will operate on all of the pawn's object polygons.
  • the user first reorients the rendering of the pawn 170 in the object space region 91 (through direct object manipulations or by using the Rotx, Roty and Dolly thumbwheels as appropriate) to make the pawn's bottom surface 173 visible to the user as shown in FIG. 17C.
  • the corresponding mappings into the texture space are displayed in the texture space region 92 as a red line 174 with small red blocks marking the associated texture coordinates. Because the 24 object polygons are mapped to a single line 174 in the texture space, the pawn's bottom surface appears to have little or none of the texture map applied to it.
  • the user has specified a new “best fit” mapping function for the pawn's bottom surface--namely, the standard plane texture mapping function.
  • a standard plane texture mapping function is applied to the 24 object polygons forming the pawn's bottom surface 173 , as indicated by the changed appearance of the pawn's bottom surface 173 in FIG. 17E.
  • the new texture mapping is displayed in the texture space region 92 as an octagonal mapping mesh 175 of 17 texture coordinates forming 24 regions which effectively are laid directly on top of the texture map 171 .
  • the “Plane:Best” texture mapping function specified by the user is applied only to the currently selected portion of the object--in this example, the 24 object polygons forming the pawn's bottom surface 173 .
  • the texture mappings for the object's other, unselected polygons are unaffected.
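  • A minimal sketch of applying a chosen mapping function to only the selected portion of an object (illustrative Python; plane_map_xy is a stand-in for whichever standard mapping function the user picks): texture coordinates are recomputed for the selected polygons and left untouched everywhere else.
      def plane_map_xy(vertices):
          # Stand-in mapping: drop z and treat (x, y) as (s, t); a real mapper would normalize.
          return [(x, y) for x, y, z in vertices]

      def remap_selected(polygons, tex_coords_per_polygon, selected, mapping_fn):
          """Apply mapping_fn only to the selected polygons' texture coordinates."""
          result = list(tex_coords_per_polygon)
          for i in selected:
              result[i] = mapping_fn(polygons[i])
          return result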
  • the user next has decided that the texture dots on the bottom surface 173 of the pawn 170 appear too small relative to the other dots on the body of the pawn.
  • the user scales the mapping mesh 175 (and thus all of the 17 texture coordinates forming the mesh) about its center point 176 down to roughly one-fourth of its original size, as shown in FIG. 17G.
  • the texture dots on the pawn's bottom surface 173 are magnified to an extent that only about four texture dots can fit on the bottom surface 173 .
  • the user has decided to correct some of the remaining texture smearing in region 172 and so has re-oriented the display of the pawn 170 so that region 172 is readily visible as shown in FIG. 17H.
  • the user has selected five contiguous object polygons 177 on the top surface of the pawn's base, as shown in FIG. 17I.
  • the corresponding mappings into the texture space are represented by the 12 texture coordinates 178 displayed in the texture space region 92 .
  • the Texture Applicator utility has zoomed in on (i.e., magnified) the texture space region 92 automatically so that the texture coordinates are readily visible to the user.
  • the Texture Applicator utility displays the texture coordinates as large as possible to fit within the texture space region 92 . This auto-zoom feature can be disabled by a user if desired.
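  • The auto-zoom behavior can be approximated by framing the selected texture coordinates with a small margin (a sketch with assumed names; the actual view logic is not described in this document):
      def auto_zoom(selected_coords, margin=0.05):
          """Return a view rectangle (s_min, t_min, s_max, t_max) that frames the selection."""
          ss = [s for s, _ in selected_coords]
          ts = [t for _, t in selected_coords]
          width = (max(ss) - min(ss)) or 1e-6
          height = (max(ts) - min(ts)) or 1e-6
          return (min(ss) - margin * width, min(ts) - margin * height,
                  max(ss) + margin * width, max(ts) + margin * height)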
  • the Texture Applicator utility also provides a full range of viewing capabilities which allow users to manually zoom in/out, dolly, rotate, etc. as desired.
  • the texture smearing in this example is a result of mapping a relatively thin slice of a row of texture dots 179 across the relatively larger object polygons 177 .
  • the user has modified the texture mapping for the object polygons 177 by moving the 12 texture coordinates which form region 178 slightly upward to a location on the texture map 171 between two rows of texture dots (i.e., between row 179 and row 180 ).
  • the texture mapping no longer requires the row 179 of texture dots to be stretched across the object polygons 177 and the smearing effect has been reduced dramatically.
  • Each such computer program is preferably stored on a storage medium or device (e.g., CD-ROM, hard disk or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described in this document.
  • the system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner.

Abstract

A texture is applied to a 3D graphic object using a computer system by providing a graphic indication of a mapping between a 3D graphic object and a texture map and modifying the mapping based on a manipulation of the graphic indication. Alternatively, or in addition, a texture can be applied to a 3D graphic object by selectively applying one or more predetermined mapping functions between a texture space and an object space to designated portions of the 3D graphic object. The modification of mappings, and application of the mapping functions, can be based on input received from a user of the computer system, for example, by means of direct manipulation techniques applied in a graphical user interface.

Description

    BACKGROUND
  • This invention relates to mapping textures onto 3D graphic objects. [0001]
  • The computer system illustrated in FIG. 1--which includes [0002] mouse 15, keyboard 16, CPU 17 and CRT 18--represents a hardware setup for running software that allows a user to model and/or render 3D graphic objects. Modeling and rendering of 3D graphic objects are used, for example, to generate 3D virtual scenes on the display screen of a computer system. The surface topology of a 3D object can be modeled as a set of polygons of appropriate sizes and shapes joined at their edges. The set of polygons defining the 3D object, referred to as the “model” for the 3D object, is represented by a collection of coordinates residing in a logical space (referred to as the “object space”) defined by a set of three axes, x, y and z. If the 3D object is to be displayed in wire-frame form, only the edge lines that form the polygons in the model are rendered, thus generating a skeleton-like image of the object. For example, the model of an X-29 aircraft shown in FIG. 2A consists of several hundred unfilled polygons joined together at their edges to form a wire-frame representation of the aircraft.
  • A more realistic rendering of the 3D object can be achieved by shading the polygons with various colors as appropriate, for example, as shown in FIG. 2B, by taking into account lighting and surface properties of the 3D object under consideration. The degree of realism or detail can be enhanced further by adding a texture to the surface of the 3D object. FIGS. [0003] 3A-3H show four different examples of before and after views of various surface textures being applied to four different 3D objects--a cube (FIGS. 3A-3B), a cylinder (FIGS. 3C-3D), a sphere (FIGS. 3E-3F) and a cone (FIGS. 3G-3H).
  • Applying a surface texture to a 3D object is referred to as “texture mapping” in which an image or “texture map”--a [0004] 2D array of pixel information, such as the map of Earth shown in FIG. 4--is overlaid on a 3D object's surface typically in a repeating manner, roughly similar to laying tiles on a floor. Alternatively, the texture can be stretched to cover the surface under consideration. A texture map resides in its own logical space (referred to as the “texture space”) defined by a pair of axes s and t as shown in FIG. 4. The texture space is independent of an object space in which the surface topology of a 3D object is defined.
  • A texture is mapped onto a 3D object by creating a mapping, or correlation, between coordinates in the object space and coordinates in the texture space. Different mapping functions can be used to apply a texture to a 3D object depending on the object designer's preferences. Many of the various different “primitives” (fundamental surface topologies) each have an associated standard mapping function. As shown in FIG. 5A, for example, the standard texture mapping function for a cube is to apply the entire texture map to each face of the cube. The standard mapping function for a cylinder is to apply the texture map to each end face of the cylinder and to wrap one instance of the entire texture map around the cylinder's circumference, as shown in FIG. 5B. Standard texture mapping functions also exist for a sphere (FIG. 5C) and a cone (FIG. 5D). [0005]
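  • To make the correspondence concrete, the following sketch (illustrative Python, not part of the patent) represents a textured polygon as a pairing of object-space vertices (x, y, z) with texture-space coordinates (s, t), using the standard cube mapping in which the entire texture map covers each face.
      from dataclasses import dataclass
      from typing import List, Tuple

      @dataclass
      class TexturedPolygon:
          vertices: List[Tuple[float, float, float]]  # coordinates in object space (x, y, z)
          tex_coords: List[Tuple[float, float]]       # corresponding coordinates in texture space (s, t)

      # Standard cube mapping: the full texture (s, t in [0, 1]) is applied to each face,
      # so every face reuses the same four texture coordinates.
      front_face = TexturedPolygon(
          vertices=[(-1, -1, 1), (1, -1, 1), (1, 1, 1), (-1, 1, 1)],
          tex_coords=[(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)],
      )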
  • For non-primitive objects (e.g., objects having arbitrary surface topologies such as the bust of the female head shown in FIG. 6A), a standard plane mapping function typically is used to apply a texture to the surface of a 3D object. First, the 3D object's bounding box (i.e., a logical 3D box that is just big enough to encompass the 3D object) is calculated and then texture (for example, the texture map shown in FIG. 6B) is mapped to the largest face of the bounding box. In effect, plane mapping is accomplished by projecting the texture through the 3D graphic object, analogous to projecting an image from a slide projector onto a 3D physical object, as shown in FIG. 6C. These and other texture mapping techniques are discussed in Josie Wernecke, The Inventor Mentor: Programming Object-Oriented 3D Graphics with Open Inventor, Release [0006] 2, chapter 7, Addison-Wesley (1994), which is incorporated by reference.
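  • A minimal sketch of the plane-mapping idea described above (illustrative names and conventions, not taken from the patent or from Open Inventor): compute the bounding box, choose its largest face, and project each vertex onto that face to obtain (s, t) values in [0, 1].
      def plane_map(vertices):
          """Project object-space vertices onto the largest face of their bounding box."""
          xs, ys, zs = zip(*vertices)
          mins = [min(xs), min(ys), min(zs)]
          maxs = [max(xs), max(ys), max(zs)]
          extents = [maxs[i] - mins[i] for i in range(3)]

          # The largest face of the bounding box is perpendicular to the axis with the
          # smallest extent, so project along that axis and keep the other two.
          drop = extents.index(min(extents))
          s_axis, t_axis = [i for i in range(3) if i != drop]

          tex_coords = []
          for v in vertices:
              s = (v[s_axis] - mins[s_axis]) / (extents[s_axis] or 1.0)
              t = (v[t_axis] - mins[t_axis]) / (extents[t_axis] or 1.0)
              tex_coords.append((s, t))
          return tex_coords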
  • Users can designate a texture to be applied to a 3D object by using an interactive graphical user interface (GUI) utility such as that shown in FIG. 7. The utility includes a [0007] Texture Mapper window 70 having a preview region 72 in which a preview version of the 3D object 71 being texture-mapped is displayed. A user designates a new or different texture for the 3D object by clicking the cursor on one of several different texture choices displayed in a texture palette 73. In response, a preview version of the object 71 with the selected texture applied is displayed in the preview window 72 to give the user an indication of how the texture mapped object will appear when rendered at a full level of detail.
  • A texture mapping utility may provide other features that allow a user to specify certain aspects of the manner in which a texture is mapped to the 3D object. Such features may allow the user to translate the texture map relative to the 3D object, to select among repeating and stretching type mapping functions, or to specify a texture repetition interval. [0008]
  • SUMMARY
  • In one aspect, a texture is applied to a 3D graphic object using a computer system by providing a graphic indication of a mapping between a 3D graphic object and a texture map and modifying the mapping based on a manipulation of the graphic indication. Substantially at the same time as the modification of the mapping occurs, the 3D graphic object can be rendered based on the modified mapping. The modification of the mapping and the rendering of the 3D graphic object can occur in real time. [0009]
  • Prior to the modification of the mapping, user input specifying the manipulation of the graphic indication of the mapping may be received through a graphic user interface. This may include presenting a first display region representative of a texture space and at least one other display region representative of an object space. The texture map may be displayed in the first display region and the 3D graphic object may be displayed in the other display region. Provision of the graphic indication of the mapping may include displaying in the first display region a graphic element representative of the 3D graphic object at a location on the texture map corresponding to the mapping. Manipulation of the graphic indication may include performing a direct manipulation operation on the graphic element displayed in the first display region. The direct manipulation operation may include translating the graphic element to a new location on the texture map, scaling the graphic element relative to the texture map, rotating the graphic element relative to the texture map, or a combination thereof. Input may be received from a user specifying the direct manipulation operation to be performed. Visual feedback may be provided as the direct manipulation operation is being performed. Moreover, if the direct manipulation operation causes the graphic element to extend beyond a boundary of the texture map, an instance of the texture map may be repeated automatically. [0010]
  • Modification of the mapping may be accomplished by modifying a mapping between coordinates in an object space and coordinates in a texture space. The modification may involve altering only a portion of the mapping between the 3D graphic object and the texture map. The 3D graphic object can be modelled as a plurality of polygons defined by coordinates in an object space. Modification of the mapping can be based on input received from a user specifying one or more polygons for which the mapping is to be modified. A set of coordinates in the object space and a set of coordinates in the texture space may be selectively shared or not (e.g., based on user input) between adjacent polygons. Where coordinates are shared, modification of the mapping for a polygon also may modify the mapping for an adjacent polygon. Alternatively, modification of the mapping may alter the mapping for a polygon without altering the mapping for an adjacent polygon. [0011]
  • In another aspect, a texture is applied to a 3D graphic object using a computer system by providing a visual indication of a correlation between a 3D graphic object and a texture map and modifying the correlation by selectively applying a predetermined mapping function between a texture space and an object space to a portion of a 3D graphic object. Modification of the correlation may involve selectively applying one of several available predetermined mapping functions between a texture space and an object space to different portions of the 3D graphic object. User input may be received (e.g., through a graphical user interface) that specifies one of the predetermined mapping functions which is to be applied to the portion of the 3D graphic object. User input also may be received that specifies different portions of the 3D graphic object to which the specified mapping functions are to be applied. [0012]
  • In another aspect, a user interface for applying a texture to a 3D graphic object includes an object region displaying a 3D graphic object formed of polygons, a texture region displaying a texture map, and a marker (e.g., a graphic indication) within the texture region defining a mapping between a polygon of the 3D graphic object and the texture map. The marker may appear visually similar to a corresponding polygon forming the 3D graphic object. Movement of the marker within the texture map may cause a corresponding change in the mapping between the polygon of the 3D graphic object and the texture map. Several markers may be displayed within the texture region each of which defines a mapping between a corresponding polygon of the 3D graphic object and the texture map. In that case, one of the markers can be moved independently of the other markers. Movement of one of the markers selectively may or may not affect the mappings of adjacent markers. [0013]
  • Advantages of the texture mapping methods and techniques described here may include one or more of the following. The texture manipulation techniques and methods described here represent a rich and powerful set of tools with which texture mappings can be adjusted or fine tuned according to a user's preferences. As a result, users are able to generate highly realistic textures on 3D objects having arbitrarily complex surface topologies. The texture manipulation tools are presented in an intuitive graphic user interface that allows users to modify texture mappings using direct manipulation techniques. Users are guided in their texture modifications through real time visual feedback. As a result, users can craft texture mappings to their liking through a few simple, fast and easy steps. [0014]
  • The texture manipulation tools provide a dramatic increase in the flexibility with which texture mappings can be modified. Users have the option of modifying a texture mapping for any individual polygon in the 3D object or for an arbitrary set of the object's polygons. The texture modification can be accomplished through any of three different types of operations (translating, scaling, rotating), further increasing the flexibility available to users. Moreover, by allowing users to specify whether or not texture coordinates are to be shared between adjacent object polygons, users are given the option of modifying the texture mapping for an object polygon independently of the neighboring object polygons. [0015]
  • By enabling a user to apply various different standard texture mapping functions to designated portions of a 3D object, the user can correct texture mapping artifacts on the object's coarse features quickly and easily. As a result, the task of fine tuning the texture mappings for the 3D object's remaining features is simplified considerably. [0016]
  • Other advantages and features will become apparent from the following description, including the drawings and claims.[0017]
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a computer system displaying a 3D image. [0018]
  • FIGS. 2A and 2B are views of an X-29 aircraft rendered in wire-frame and shaded representations, respectively. [0019]
  • FIGS. [0020] 3A-3H show before and after views of four different examples of applying a texture to the surface of a 3D graphic object.
  • FIG. 4 is an example of a texture map. [0021]
  • FIGS. [0022] 5A-5D illustrate the standard texture mapping functions for four different primitives.
  • FIGS. [0023] 6A-6C show an example of a texture applied to a 3D object having an arbitrary surface topology.
  • FIG. 7 shows an example of a graphic user interface utility for applying textures to 3D graphic objects. [0024]
  • FIGS. [0025] 8A-8C show examples of artifacts that can arise when a texture is applied to a 3D object having an arbitrary surface topology.
  • FIGS. [0026] 9A-9B and 10 show how object polygons are mapped to a texture in the PEP Texture Applicator window.
  • FIGS. [0027] 11A-11B are before and after views of modifying a texture mapping by translating a single texture coordinate.
  • FIGS. [0028] 12A-12B are before and after views of modifying a texture mapping by translating multiple texture coordinates.
  • FIGS. [0029] 13A-13B are before and after views of modifying a texture mapping by scaling texture coordinates.
  • FIG. 14 illustrates the automatic texture map repetition feature of the PEP Texture Applicator. [0030]
  • FIGS. [0031] 15A-15B are before and after views of modifying a texture mapping by rotating texture coordinates.
  • FIGS. [0032] 16A-16E are successive screen shots showing an example of breaking shared texture coordinates between object polygons.
  • FIGS. [0033] 17A-17J are successive screen shots showing an example of modifying a texture mapping to correct artifacts.
  • DETAILED DESCRIPTION
  • Although standard texture mapping techniques have proven useful in creating renderings of textured 3D objects with varying degrees of realism, they are primarily intended for application to 3D object primitives or other simple surface topologies. When the standard techniques are applied to a 3D object having an arbitrarily complex surface topology, surface artifacts and other flaws that mar the appearance of the textured object may appear. [0034]
  • The Silicon Graphics Cosmo Worlds authoring environment provides a rich body of texture application and manipulation functionality that enables a user of a computer system to apply a texture and fine tune it until the object has the desired appearance. Even when textures are applied to 3D objects having arbitrarily complex surface topologies, Cosmo Worlds texture manipulation features enable users to make the object appear unflawed and realistic. Cosmo Worlds implements this and other functionality through intuitive visual manipulation tools that allow users to easily and directly manipulate texture mappings applied to 3D graphic objects. Among other features, these tools enable users to orient, align, position and size a texture map relative to a 3D graphic object and to apply different standard mapping functions to different portions of a single 3D graphic object. Online documentation, incorporated herein by reference, describing the operation and features of Cosmo Worlds may be accessed at the following location on the World Wide Web: [0035]
  • http://cosmo.sgi.com/worlds/support/CosmoWorlds_UG/ [0036]
  • A copy of selected portions of the documentation is attached as Appendix A. [0037]
  • FIGS. [0038] 8A-8C show examples of two types of artifacts--“smearing” and “smudging”--which can occur when a texture is mapped onto the surface of a 3D object using the standard plane texture mapping function. In the example shown in FIGS. 8A and 8B, the multi-colored polka dot texture map 80 of FIG. 8A was mapped onto a 3D object 81 (a chess pawn) having a relatively complex surface topology, with the result of the mapping shown in FIG. 8B.
  • When a texture is applied to an object using the standard plane mapping function, texture mapping artifacts may not be readily apparent when the 3D object is rendered in a frontal view (i.e., looking straight ahead at the largest face of the bounding box) depending on the surface topology of the 3D object under consideration. However, when viewing a rendering of a 3D object as in FIGS. 8B or [0039] 8C, the texture mapping artifacts are readily apparent. For example, “smearing” of the polka dot texture (stretching the texture to the point of distortion) is evident on a face 82 of the object 81 in FIG. 8B. Similarly, “smudging” of the polka dot texture (compressing the texture to the point of distortion) is evident on a face 83 when the texture is applied to a different object 84 (a multi-fold screen) as shown in FIG. 8C.
  • The texture mapping artifacts illustrated in FIGS. 8B and 8C arise when a finite area of the surface of the object is mapped to a disproportionately sized region in the texture map. Smearing occurs when a designated area of the object is disproportionately large in comparison to the area of the texture to which the object is mapped. In effect, the texture's appearance becomes distorted as shown in FIG. 8B due to the extent to which the texture map must be stretched to cover the associated area of the 3D object. [0040]
  • Conversely, smudging occurs when the designated area of the object is disproportionately small in comparison to the area of the texture to which the object is mapped. In effect, the texture's appearance becomes distorted as shown in FIG. 8C due to the extent to which the texture map must be compressed to fit within the associated area of the 3D object. [0041]
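  • One way to make the disproportion concrete (a heuristic sketch, not a method claimed in the patent) is to compare a polygon's area in object space with the area of the texture region it maps to; a large ratio indicates smearing and a small ratio indicates smudging.
      import math

      def polygon_area_3d(pts):
          # Area of a planar polygon in object space, via a cross-product fan around pts[0].
          a = [0.0, 0.0, 0.0]
          for i in range(1, len(pts) - 1):
              u = [pts[i][k] - pts[0][k] for k in range(3)]
              v = [pts[i + 1][k] - pts[0][k] for k in range(3)]
              a[0] += u[1] * v[2] - u[2] * v[1]
              a[1] += u[2] * v[0] - u[0] * v[2]
              a[2] += u[0] * v[1] - u[1] * v[0]
          return 0.5 * math.sqrt(a[0] ** 2 + a[1] ** 2 + a[2] ** 2)

      def polygon_area_2d(pts):
          # Shoelace formula for the mapped region in texture space.
          n = len(pts)
          return 0.5 * abs(sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
                               for i in range(n)))

      def stretch_ratio(object_polygon, texture_polygon):
          """Ratio of object-space area to texture-space area for one polygon."""
          return polygon_area_3d(object_polygon) / max(polygon_area_2d(texture_polygon), 1e-9)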
  • Cosmo Worlds includes a Texture Applicator utility which provides an interactive GUI that enables users to correct artifacts such as smearing and smudging, or otherwise manipulate a texture mapping, by adjusting the relationship between individual points of the 3D object model and locations of texture coordinates on the texture map. User interaction with the Texture Applicator utility is based on the Cosmo Worlds [0042] PEP (Point-Edge-Polygon) editing paradigm. With PEP editing, users can designate and manipulate one or more of the points, edges or polygons (or any combination thereof) that make up a 3D object, rather than manipulating the 3D object as a whole. PEP editing is discussed in further detail in the online documentation referenced above, and in U.S. patent application Ser. No. 08/845,857, entitled “MANIPULATING GRAPHIC OBJECTS IN 3D SCENES,” which is incorporated by reference.
  • The Texture Applicator employs a direct, as opposed to an indirect, object manipulation paradigm. With indirect manipulation tools, users do not work on the graphic object itself but rather use a separate graphical abstraction (e.g., a slider bar or a thumbwheel), typically appearing at a fixed location on the display screen, to bring about corresponding changes to the graphic object. Direct manipulation tools, in contrast, allow a user to manipulate a graphic object directly by placing the cursor in the proximity of the object, and dragging the cursor within the scene to effect the desired change to the object. In the case of the Texture Applicator, users can modify a texture mapping by directly adjusting positions within the texture space to which associated polygons of the 3D object are mapped. In general, a direct manipulation tool, such as the Texture Applicator, enhances the intuitiveness of a user's interactions with the tool and thus enables the user to accomplish a desired object manipulation more quickly, easily and precisely. [0043]
• [0044] As shown in FIGS. 9A-9B, the Texture Applicator window 90 includes two regions: an object space region 91 on the right, which displays a rendering of the 3D object 81 to which a texture is applied, and a texture space region 92 on the left, which provides a visual indication of how the texture map 93 is to be applied to the object 81. By performing appropriate mouse manipulations within these two regions 91 and 92, a user can specify a desired arbitrary mapping into the texture space for one or more of the individual polygons forming the 3D object (referred to as “object polygons”). To do so, the user first designates within the object space region 91 the one or more object polygons for which the texture mapping is to be manipulated. If no object polygons are selected, subsequent texture coordinate manipulation operations are performed on all object polygons forming the 3D object.
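One common way to represent the relationship that the two regions visualize is an indexed face set, in which each object polygon refers to entries in a shared list of object-space points and a shared list of texture coordinates. The C++ sketch below is an illustrative assumption, not the patent's actual data layout; selecting polygons for manipulation then amounts to collecting the texture-coordinate indices they reference:

    #include <set>
    #include <vector>

    struct Point3 { float x, y, z; };     // object-space coordinate
    struct TexCoord { float u, v; };      // texture-space coordinate

    // Each polygon stores indices into the shared coordinate lists, so
    // neighboring polygons can reference (instance) the same entries.
    struct Polygon {
        std::vector<int> pointIndices;    // indices into Model::points
        std::vector<int> texIndices;      // indices into Model::texCoords
    };

    struct Model {
        std::vector<Point3>   points;
        std::vector<TexCoord> texCoords;
        std::vector<Polygon>  polygons;
    };

    // Gather the texture coordinates referenced by the selected polygons;
    // these are the markers drawn in the texture space region.  An empty
    // selection means every polygon is affected, as described above.
    std::set<int> selectedTexCoords(const Model& model, const std::vector<int>& selection) {
        std::set<int> result;
        std::vector<int> all;
        const std::vector<int>* polys = &selection;
        if (selection.empty()) {
            for (int i = 0; i < static_cast<int>(model.polygons.size()); ++i) all.push_back(i);
            polys = &all;
        }
        for (int p : *polys)
            for (int t : model.polygons[p].texIndices)
                result.insert(t);
        return result;
    }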
• [0045] In the example of FIG. 9A, the user has selected for texture mapping manipulation a single object polygon 94, as indicated by the surrounding highlighting visual feedback. Specifically, the perimeter of the selected object polygon 94 is rimmed with a white line having a red stripe and white/red circles appear at each of the polygon's apexes. In response to the selection of the object polygon 94, three corresponding texture coordinates forming region 95 (formed from red lines connecting the texture coordinates with a small red block at each apex) are displayed in the texture space region 92 at a location corresponding to the current texture mapping for the selected object polygon 94. The lines between texture coordinates are intended to provide users with information about the relationships between texture coordinates. The user may select different or additional object polygons in the object space region 91 and their respective mappings into the texture space will be represented as corresponding sets of three or more texture coordinates in the texture space region 92. As shown in FIG. 9B, the user in this example has selected two additional object polygons 96 and 97 on either side of object polygon 94 and their respective mappings into the texture space are displayed accordingly in texture space region 92.
• [0046] Similarly in the example of FIG. 10, which shows the texture mapping for a top end surface of a cylinder 100, the user has selected all of the 16 wedge-shaped object polygons 101 that form the cylinder's end surface. The corresponding mappings into the texture space for the selected object polygons are represented by the circular mesh 102 formed from 17 texture coordinates as displayed in the texture space region 92.
• [0047] Once the user has selected one or more object polygons for which texture mapping manipulation is desired, the user then can modify the texture mappings for those object polygons using any of three basic modes: translation, scaling or rotation.
• [0048] FIGS. 11A-11B illustrate an example of modifying a texture mapping using translation, as indicated by the active state of the arrow button 118 in the Texture Applicator window 90. As shown in FIG. 11A, the user has selected in object space region 91 a single object polygon 110 which corresponds to an entire face of cube 112. The associated mapping into the texture space for object polygon 110 is indicated in texture space region 92 by four texture coordinates (113, 114, 115, 116) which form a square region 111. The user can modify the texture mapping for object polygon 110 by moving one or more of the four texture coordinates (113, 114, 115 or 116) within the texture space region 92.
• [0049] As indicated by the surrounding red circle and the adjacent cursor, the user in this example has selected a single texture coordinate 113. The user then modifies the texture mapping by dragging texture coordinate 113 toward the center of the texture map, thereby transforming the square region 111 into an irregularly shaped region 117, as shown in FIG. 11B. As the user drags the texture coordinate 113 with the cursor, the display of the cube 112 in the object space region 91 is updated in real time to reflect the current texture mapping as it is being modified. In this example, the user's cursor manipulations have caused a reduced area of the texture map (i.e., the area bounded by region 117 in FIG. 11B) to be mapped to face 110 of cube 112. Because the area of the texture map bounded by region 117 is now smaller than the area of face 110, the designated portion of the texture map is stretched to cover face 110, giving the cube 112 the appearance shown in FIG. 11B.
• [0050] In FIGS. 12A-12B, which show another example of modifying a texture mapping using translation, the user has designated a face 121 of a cube 120 as the object polygon for which texture mapping is to be modified. The corresponding mapping into the texture space is represented by four texture coordinates (123, 124, 125 and 126) which form region 122, and which occupy only a subset of the overall texture map 127. In this example, the user has selected all four of the texture coordinates (123, 124, 125 and 126) that form the region 122 and has dragged them as a group to a new location on the texture map 127 as shown in FIG. 12B. In response, the display of the cube 120 in the object space region 91 is changed in real time to reflect the modified texture mapping, which in this case does not require any stretching or compression of the texture but rather simply reflects a different subset of the texture map 127.
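The translation mode described in FIGS. 11A-12B amounts to adding a constant (u, v) offset to every selected texture coordinate while leaving the polygon's object-space geometry untouched. A minimal C++ sketch of that operation, reusing the illustrative Model structure sketched earlier (the function name and interactive details are assumptions, not the patent's code):

    #include <set>

    // Translate the selected texture coordinates by (du, dv).  Called
    // repeatedly while the user drags, so the rendered object can be
    // updated in real time after each call.
    void translateTexCoords(Model& model, const std::set<int>& selected,
                            float du, float dv) {
        for (int idx : selected) {
            model.texCoords[idx].u += du;
            model.texCoords[idx].v += dv;
        }
        // After updating, the renderer re-samples the texture using the new
        // coordinates; no object-space vertex changes are needed.
    }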
• [0051] FIGS. 13A-13B illustrate an example of modifying a texture mapping using scaling, as indicated by the active state of the scaling button 130 in the Texture Applicator window 90. As shown in FIG. 13A, the user has selected a single object polygon 131 which corresponds to an entire face of cube 132. The associated mapping into the texture space for object polygon 131 is indicated in texture space region 92 by the four texture coordinates forming region 133. The user then can modify the texture mapping for the selected object polygon by scaling the four texture coordinates forming region 133 as desired (larger or smaller) about a point (the “scale-about” point 134) specified by the user. Positioning of the scale-about point is accomplished by clicking on a button 119 (or simply by clicking the cursor on the scale-about point provided it is visible in the current view) and then dragging the scale-about point to the desired location.
• [0052] As indicated by the surrounding red box, the user in this example has positioned the scale-about point 134 at the center of region 133 (which happens to coincide with the center of the texture map 135). To cause the selected face 131 of the cube 132 to be mapped to a smaller portion of the texture map 135 (equivalently, to reduce the area bounded by region 133), the user clicks the mouse button down anywhere within the texture space region 92 (in this example, at click-down point 136 in FIG. 13A) and drags the cursor toward the scale-about point 134. Upon clicking down, visual feedback is provided to the user in the form of a four-headed arrow 137 (to indicate that the cursor may be moved in any direction) and a box 138 formed of blue-and-white dashed lines which rubber-bands to continuously occupy the space between the scale-about point 134 and the current cursor point as the user moves the cursor within texture space region 92 (to give the user a sense of the relative distance over which scaling has occurred).
• [0053] As shown in FIG. 13B, the user in this example has dragged the cursor from the click-down point 136 to a point 139, and the region 133 that corresponds to the texture mapping accordingly has shrunk to roughly a quarter of its original size. The resultant texture mapping for face 131, showing an effective magnification of the texture relative to its appearance in FIG. 13A, is displayed in object space region 91.
• [0054] Alternatively, the user could have increased the size of the area bounded by region 133 (i.e., the area of the texture map to which a selected object polygon is mapped) by dragging the cursor away from the designated scale-about point. In that case, a larger portion of the texture map 135 would have been mapped to face 131 as a result.
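Scaling in this mode is a uniform scale of the selected texture coordinates about the user-chosen scale-about point, with the scale factor derived from how far the cursor has moved toward or away from that point. A hedged C++ sketch of the underlying update, again using the illustrative Model structure from earlier (how the factor is derived from cursor distances is an assumption):

    #include <set>

    // Scale selected texture coordinates about (cu, cv).  A factor < 1
    // shrinks the region (mapping the polygon to a smaller piece of the
    // texture, i.e. magnifying the texture on the object); a factor > 1
    // enlarges it.
    void scaleTexCoords(Model& model, const std::set<int>& selected,
                        float cu, float cv, float factor) {
        for (int idx : selected) {
            TexCoord& t = model.texCoords[idx];
            t.u = cu + (t.u - cu) * factor;
            t.v = cv + (t.v - cv) * factor;
        }
    }

    // One plausible way to derive the factor while dragging: the ratio of
    // the current cursor distance from the scale-about point to the
    // distance at the click-down point (illustrative assumption).
    float scaleFactor(float clickDist, float currentDist) {
        return clickDist > 0.0f ? currentDist / clickDist : 1.0f;
    }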
• [0055] If a user attempts to increase the area of a region formed from texture coordinates beyond the size of a single instance of the texture map, the Texture Applicator utility repeats the texture map automatically as needed. As shown in FIG. 14, for example, after selecting face 141 of cube 140 for texture mapping modification, the user has dragged one of the points (point 143) that defines the region 142 to the right beyond the border of the texture map 144. In response, the texture map 144 is repeated automatically causing another instance 145 of the texture map to appear in the texture space region 92. In response to user interactions, the texture map will continue to be repeated automatically in any or all directions to accommodate virtually any size of mapping. Similarly, repeated instances of the texture map will disappear as needed in response to a reduction in the size of the mapping.
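Texture repetition of this kind corresponds to letting texture coordinates range outside [0, 1] and wrapping them back into the unit square when sampling (hardware APIs such as OpenGL expose this as a repeat wrap mode). A small C++ sketch of the wrapping rule, shown here only as an illustration of the idea rather than the patent's implementation:

    #include <cmath>

    // Wrap an arbitrary texture coordinate back into [0, 1).  Coordinates
    // outside the unit square simply address another repeated instance of
    // the texture, which is why the Texture Applicator can display extra
    // copies of the map as coordinates are dragged past its border.
    float wrapCoord(float c) {
        return c - std::floor(c);   // e.g. 1.3 -> 0.3, -0.2 -> 0.8
    }

    // Number of texture instances needed to cover coordinates spanning
    // [minC, maxC] along one axis, used to decide how many repeated copies
    // to draw in the texture space region (illustrative calculation).
    int instancesNeeded(float minC, float maxC) {
        return static_cast<int>(std::ceil(maxC) - std::floor(minC));
    }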
• [0056] FIGS. 15A-15B illustrate an example of modifying a texture mapping using rotation, as indicated by the active state of the rotation button 159 in the Texture Applicator window 90. As shown in FIG. 15A, the user has selected a single object polygon 151 which corresponds to an entire face of cube 150. The associated mapping into the texture space for the face 151 is indicated in texture space region 92 as four texture coordinates forming a square region 152. The user then can modify the texture mapping for the face 151 by rotating selected texture coordinates as desired (clockwise or counterclockwise) about a point (the “rotation point”) specified by the user. Positioning of the rotation point is accomplished in the same manner as positioning of the scale-about point during scaling--namely, by clicking on the button 119 (or simply by clicking the cursor on the rotation point provided it is visible in the current view) and then dragging the rotation point to the desired location.
• [0057] As indicated by the surrounding red box, the user in this example has positioned the rotation point 153 at the center of region 152 (which happens to coincide with the center of the texture map 154). To cause the texture coordinates forming region 152 to rotate to occupy a different subset of the texture map 154, the user clicks the mouse button down anywhere within the texture space region 92 (in this example, at click-down point 155 in FIG. 15A) and drags the cursor in a counterclockwise direction about the rotation point 153. Upon clicking down, visual feedback is provided to the user in the form of a partial circle 156 formed of two arced arrows (to indicate that the cursor may be rotated in either direction) and a blue-and-white dashed line 157 which rubber-bands to continuously extend between the rotation point 153 and the current cursor point as the user moves the cursor within texture space region 92 (to give the user a sense of the relative degree to which rotation has occurred).
• [0058] As shown in FIG. 15B, the user in this example has dragged the cursor from click-down point 155 to a point 158, and the four texture coordinates forming region 152 have rotated about the rotation point 153 by roughly 45 degrees from their original orientation. The resultant texture mapping for face 151 is displayed in object space region 91.
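Rotation mode applies a standard 2D rotation of the selected texture coordinates about the rotation point, with the angle taken from how far the cursor has swept around that point. A minimal C++ sketch, again using the illustrative Model structure from earlier (the angle-from-cursor calculation is an assumption):

    #include <cmath>
    #include <set>

    // Rotate selected texture coordinates by `angle` radians about the
    // rotation point (cu, cv); a positive angle rotates counterclockwise.
    void rotateTexCoords(Model& model, const std::set<int>& selected,
                         float cu, float cv, float angle) {
        float c = std::cos(angle), s = std::sin(angle);
        for (int idx : selected) {
            TexCoord& t = model.texCoords[idx];
            float du = t.u - cu, dv = t.v - cv;
            t.u = cu + du * c - dv * s;
            t.v = cv + du * s + dv * c;
        }
    }

    // Angle swept by the cursor from the click-down point (u0, v0) to the
    // current point (u1, v1), measured about the rotation point (cu, cv).
    float sweptAngle(float cu, float cv, float u0, float v0, float u1, float v1) {
        return std::atan2(v1 - cv, u1 - cu) - std::atan2(v0 - cv, u0 - cu);
    }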
• [0059] In the default state, each object polygon in a 3D object shares both object coordinates and texture coordinates with its neighboring polygons. As a result, when the texture mapping for one polygon is changed, the texture mappings for all neighboring polygons necessarily change as well. Sometimes this behavior is desirable but other times it is not. The Texture Applicator allows users to sever shared texture coordinate relationships between object polygons selectively (referred to as “breaking instancing”), thus enabling each object polygon (or a combination of two or more object polygons) to have a separate texture mapping independent of the texture mappings for other object polygons in the 3D object.
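In terms of the indexed representation sketched earlier, breaking instancing can be implemented by giving a selected polygon its own private copies of the texture coordinates it previously shared, so that later edits to those copies leave the neighbors' mappings alone. The following C++ sketch is an illustrative assumption of how that duplication might be done, not the patent's actual code:

    // Give the polygon at `polyIndex` private copies of its texture
    // coordinates.  Neighboring polygons keep their original indices, so
    // subsequent edits to this polygon's coordinates no longer drag the
    // neighbors' mappings along with them.
    void breakInstancing(Model& model, int polyIndex) {
        Polygon& poly = model.polygons[polyIndex];
        for (int& texIdx : poly.texIndices) {
            TexCoord copy = model.texCoords[texIdx];                 // duplicate the shared entry
            model.texCoords.push_back(copy);
            texIdx = static_cast<int>(model.texCoords.size()) - 1;   // point this polygon at the copy
        }
    }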
• [0060] FIGS. 16A-16E illustrate an example of breaking instancing between adjacent object polygons so that they no longer share texture coordinates in the texture space. In this example, the 3D object is a multi-fold screen 160 formed of 16 object polygons as shown in FIG. 16A. All 16 of the object polygons have been selected and the corresponding mapping into the texture space is represented by the grid 161 (formed of 25 corresponding texture coordinates) displayed in the texture space region 92. As shown in FIG. 16B, when the user drags the four texture coordinates that define region 163 (which corresponds to the mapping for object polygon 164) to a new location on the texture map 162, the eight regions surrounding region 163 deform to follow its movement. This is because the four texture coordinates that were moved are shared by nine different object polygons. As a result, the texture mappings for object polygon 164 and its eight surrounding object polygons are changed in a corresponding manner.
• [0061] If the user desires to modify the texture mapping for one or more object polygons without affecting the texture mappings of neighboring object polygons, the user can break instancing between a designated object polygon and its neighbors by selecting the “break instancing” option under the Edit menu of the Texture Applicator utility.
• [0062] In this example, the user has selected object polygon 164 and has broken instancing between object polygon 164 and its neighbors as shown in FIG. 16C. Consequently, the user now can modify the texture mapping for object polygon 164 without affecting the texture mappings of its neighbors by moving the corresponding four texture coordinates forming region 163 to the desired location on the texture map 162.
• [0063] As shown in FIG. 16D, the user has moved the four texture coordinates which form region 163 slightly up and to the right, thereby modifying the texture mapping for object polygon 164, but not for any of the other object polygons. As a result, the overall image on the screen 160 appears in FIG. 16D as a modified self-portrait of Vincent van Gogh having three eyes. As the user moves the texture coordinates of region 163 up further relative to the texture map 162 (to the top of van Gogh's head), the texture mapping for object polygon 164 changes accordingly as shown in FIG. 16E.
• [0064] FIGS. 17A-17J illustrate an example of using the various texture manipulation features provided by the Texture Applicator to fine-tune a texture mapping according to a user's preferences.
• [0065] FIG. 17A shows the Texture Applicator window for a chess pawn 170 having a polka dot texture map 171 which was applied using the standard plane texture mapping function, and which has resulted in smearing of the texture across several object polygons in the region 172. To correct the smearing, and otherwise modify the standard texture mapping function that was applied to the pawn 170, the user has a number of options available, including being able to designate a “best fit” mapping function (under the “Mapping” menu choice) for applying the texture map 171 to one or more selected portions of the pawn 170. In this example, the user has not selected any of the object polygons forming the pawn 170 so any texture mapping modification that is performed will operate on all of the pawn's object polygons.
• [0066] As shown in FIG. 17B, because the shape of the pawn most closely approximates a cylinder (rather than some other primitive), the user in this example has specified that the standard cylinder texture mapping function should be used to map the texture 171 onto the pawn. As a result of this new texture mapping, the appearance of the pawn 170 changes accordingly, including a reduction in the amount of texture smearing in region 172.
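A standard cylindrical mapping of the kind selected here can be expressed as projecting each vertex onto a cylinder wrapped around the object: the angle around the cylinder's axis gives u and the height along the axis gives v. The sketch below is a generic formulation of such a mapping using the illustrative Point3 and TexCoord types from earlier (the axis choice and normalization are assumptions, not the specific function used by Cosmo Worlds):

    #include <cmath>

    // Generic cylindrical texture mapping: u comes from the angle of the
    // vertex around the y axis, v from its height between yMin and yMax.
    TexCoord cylinderMap(const Point3& p, float yMin, float yMax) {
        const float kPi = 3.14159265358979f;
        float angle = std::atan2(p.z, p.x);              // -pi .. pi around the axis
        TexCoord t;
        t.u = (angle + kPi) / (2.0f * kPi);              // normalize to 0 .. 1
        t.v = (p.y - yMin) / (yMax - yMin);              // height along the axis
        return t;
    }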
• [0067] Because a single standard texture mapping function ordinarily does not match an arbitrarily complex shape, applying different standard mapping functions to different portions of a 3D graphic object often can produce more realistic results. In this example, the cylindrical mapping function applied to the pawn has resulted in an undesirable (e.g., non-realistic) mapping on the bottom of the pawn. The user accordingly has decided to modify the texture mapping on the bottom surface of the pawn 170 by applying a different standard mapping function exclusively to the bottom surface. To do so, the user first reorients the rendering of the pawn 170 in the object space region 91 (through direct object manipulations or by using the Rotx, Roty and Dolly thumbwheels as appropriate) to make the pawn's bottom surface 173 visible to the user as shown in FIG. 17C.
• [0068] As shown in FIG. 17D, once the user has selected all of the 24 object polygons that form the pawn's bottom surface 173, the corresponding mappings into the texture space are displayed in the texture space region 92 as a red line 174 with small red blocks marking the associated texture coordinates. Because the 24 object polygons are mapped to a single line 174 in the texture space, the pawn's bottom surface appears to have little or none of the texture map applied to it.
• [0069] To correct this situation, the user has specified a new “best fit” mapping function for the pawn's bottom surface--namely, the standard plane texture mapping function. In response, a standard plane texture mapping function is applied to the 24 object polygons forming the pawn's bottom surface 173, as indicated by the changed appearance of the pawn's bottom surface 173 in FIG. 17E. At the same time, the new texture mapping is displayed in the texture space region 92 as an octagonal mapping mesh 175 of 17 texture coordinates forming 24 regions which effectively are laid directly on top of the texture map 171.
• [0070] The “Plane:Best” texture mapping function specified by the user is applied only to the currently selected portion of the object--in this example, the 24 object polygons forming the pawn's bottom surface 173. The texture mappings for the object's other, unselected polygons are unaffected. By selectively applying different standard texture mapping functions to different portions of the 3D object, a user is able to tailor the texture mappings for the larger features of the 3D object quickly and easily.
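A planar mapping applied only to a selected portion of the object, as described above, can be sketched as projecting the selected vertices onto a plane and normalizing the result to span the texture. The C++ helper below (reusing the illustrative Model structure) projects onto the x-z plane on the assumption that the bottom surface faces along y; the actual best-fit plane computation in Cosmo Worlds is not described at this level of detail, so this is only an illustration:

    #include <algorithm>
    #include <vector>

    // Apply a simple planar mapping to the texture coordinates referenced by
    // the selected polygons: project onto the x-z plane and scale so the
    // selection spans the full texture.  Other polygons are left untouched.
    void planeMapSelection(Model& model, const std::vector<int>& selectedPolys) {
        float xMin = 1e30f, xMax = -1e30f, zMin = 1e30f, zMax = -1e30f;
        for (int p : selectedPolys) {
            for (int pi : model.polygons[p].pointIndices) {
                const Point3& pt = model.points[pi];
                xMin = std::min(xMin, pt.x); xMax = std::max(xMax, pt.x);
                zMin = std::min(zMin, pt.z); zMax = std::max(zMax, pt.z);
            }
        }
        float dx = std::max(xMax - xMin, 1e-6f);   // avoid division by zero
        float dz = std::max(zMax - zMin, 1e-6f);
        for (int p : selectedPolys) {
            Polygon& poly = model.polygons[p];
            for (size_t i = 0; i < poly.pointIndices.size(); ++i) {
                const Point3& pt = model.points[poly.pointIndices[i]];
                TexCoord& t = model.texCoords[poly.texIndices[i]];
                t.u = (pt.x - xMin) / dx;
                t.v = (pt.z - zMin) / dz;
            }
        }
    }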
• [0071] As shown in FIGS. 17F and 17G, the user next has decided that the texture dots on the bottom surface 173 of the pawn 170 appear too small relative to the other dots on the body of the pawn. To change the size of the dots on the bottom surface 173, the user scales the mapping mesh 175 (and thus all of the 17 texture coordinates forming the mesh) about its center point 176 down to roughly one-fourth of its original size, as shown in FIG. 17G. As a result, the texture dots on the pawn's bottom surface 173 are magnified to an extent that only about four texture dots can fit on the bottom surface 173.
• [0072] Next, the user has decided to correct some of the remaining texture smearing in region 172 and so has re-oriented the display of the pawn 170 so that region 172 is readily visible as shown in FIG. 17H. To correct the texture smearing, the user has selected five contiguous object polygons 177 on the top surface of the pawn's base, as shown in FIG. 17I. The corresponding mappings into the texture space are represented by the 12 texture coordinates 178 displayed in the texture space region 92. Because the 12 texture coordinates represent a relatively fine degree of detail which might be too small for some users to see, the Texture Applicator utility has zoomed in on (i.e., magnified) the texture space region 92 automatically so that the texture coordinates are readily visible to the user. In general, whenever a user undertakes an action that causes a different set of texture coordinates to be displayed in the texture space region 92, the Texture Applicator utility displays the texture coordinates as large as possible to fit within the texture space region 92. This auto-zoom feature can be disabled by a user if desired. The Texture Applicator utility also provides a full range of viewing capabilities which allow users to manually zoom in/out, dolly, rotate, etc. as desired.
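The auto-zoom behavior described above can be thought of as fitting the bounding box of the currently displayed texture coordinates into the texture space viewport with a small margin. The C++ sketch below shows one plausible way to compute that view using the illustrative Model structure from earlier (the margin, the View2D structure, and its fields are assumptions):

    #include <algorithm>
    #include <set>

    struct View2D { float centerU, centerV, halfExtent; };  // square view in texture space

    // Fit the selected texture coordinates into the texture space region,
    // leaving roughly a 10% margin so the markers are not flush against the edge.
    View2D autoZoom(const Model& model, const std::set<int>& selected) {
        float uMin = 1e30f, uMax = -1e30f, vMin = 1e30f, vMax = -1e30f;
        for (int idx : selected) {
            const TexCoord& t = model.texCoords[idx];
            uMin = std::min(uMin, t.u); uMax = std::max(uMax, t.u);
            vMin = std::min(vMin, t.v); vMax = std::max(vMax, t.v);
        }
        View2D view;
        view.centerU = 0.5f * (uMin + uMax);
        view.centerV = 0.5f * (vMin + vMax);
        view.halfExtent = 0.55f * std::max(uMax - uMin, vMax - vMin);  // 10% margin
        return view;
    }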
• [0073] The texture smearing in this example is a result of mapping a relatively thin slice of a row of texture dots 179 across the relatively larger object polygons 177. As shown in FIG. 17J, the user has modified the texture mapping for the object polygons 177 by moving the 12 texture coordinates which form region 178 slightly upward to a location on the texture map 171 between two rows of texture dots (i.e., between row 179 and row 180). As a result, the texture mapping no longer requires the row 179 of texture dots to be stretched across the object polygons 177 and the smearing effect has been reduced dramatically.
• [0074] Other embodiments are within the scope of the claims. For example, instead of permitting users to modify a texture mapping by adjusting, within a separately displayed texture space region, the positions of texture coordinates associated with points of the 3D object, users instead (or in addition) could modify a texture mapping by selectively adjusting positions of texture coordinates within the object space itself. For example, the object polygons could be displayed in one color or format and the corresponding texture coordinates could be displayed in a different style or format. In other words, the functionality of the two display regions 91 and 92 in FIG. 9A could be combined, interchanged or otherwise modified as desired according to a software developer's preferences.
• [0075] The techniques and mechanisms described here were implemented on Silicon Graphics machines using the Open Inventor Toolkit, Motif, OpenGL, and the C++ programming language. They are not limited to any particular hardware or software configuration, but rather they may find applicability in any computing environment in which graphical content may be created or manipulated. These techniques and mechanisms may be implemented in hardware or software, or a combination of the two. Preferably, implementation is achieved with computer programs executing on programmable computers that each include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), and suitable input and output devices. Program code is applied to data entered using the input device to perform the functions described and to generate output information. The output information is applied to one or more output devices. Each program is preferably implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language.
• [0076] Each such computer program is preferably stored on a storage medium or device (e.g., CD-ROM, hard disk or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described in this document. The system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner.

Claims (44)

What is claimed is:
1. A computer-implemented method of applying a texture to a 3D graphic object, the method comprising:
providing a graphic indication of a mapping between a 3D graphic object and a texture map; and
modifying the mapping based on a manipulation of the graphic indication.
2. The method of claim 1 further comprising rendering the 3D graphic object based on the modified mapping.
3. The method of claim 2 wherein the rendering occurs substantially simultaneously with the modification of the mapping.
4. The method of claim 2 wherein the modification of the mapping and the rendering of the 3D graphic object occur in real time.
5. The method of claim 1 further comprising, prior to the modification of the mapping, receiving through a graphic user interface input specifying the manipulation of the graphic indication.
6. The method of claim 5 wherein the input specifying the manipulation of the graphic indication is received from a user of a computer system.
7. The method of claim 1 further comprising presenting a first display region representative of a texture space and at least one other display region representative of an object space.
8. The method of claim 7 further comprising displaying the texture map in the first display region and displaying the 3D graphic object in the other display region.
9. The method of claim 8 wherein providing the graphic indication of the mapping comprises displaying in the first display region a graphic element representative of the 3D graphic object at a location on the texture map corresponding to the mapping.
10. The method of claim 9 wherein manipulation of the graphic indication comprises performing a direct manipulation operation on the graphic element displayed in the first display region.
11. The method of claim 10 wherein performing the direct manipulation operation comprises translating the graphic element to a new location on the texture map.
12. The method of claim 10 wherein performing the direct manipulation operation comprises scaling the graphic element relative to the texture map.
13. The method of claim 12 further comprising receiving input from a user regarding a point about which the scaling is to be performed.
14. The method of claim 10 wherein performing the direct manipulation operation comprises rotating the graphic element relative to the texture map.
15. The method of claim 14 further comprising receiving input from a user regarding a point about which the rotating is to be performed.
16. The method of claim 10 further comprising providing visual feedback as the direct manipulation operation is being performed.
17. The method of claim 10 further comprising automatically repeating an instance of the texture map when the direct manipulation operation causes the graphic element to extend beyond a boundary of the texture map.
18. The method of claim 1 wherein the modifying comprises altering only a portion of the mapping between the 3D graphic object and the texture map.
19. The method of claim 1 wherein the modification of the mapping comprises modifying a mapping between coordinates in an object space and coordinates in a texture space.
20. The method of claim 19 wherein the 3D graphic object comprises a plurality of polygons defined by coordinates in the object space, and wherein a set of coordinates in the object space and a set of coordinates in the texture space are shared by at least two polygons in the object space.
21. The method of claim 20 wherein modification of the mapping for a polygon also modifies the mapping for an adjacent polygon.
22. The method of claim 19 wherein the 3D graphic object comprises a plurality of polygons defined by coordinates in the object space, and wherein modification of the mapping comprises altering the mapping for a polygon without altering the mapping for an adjacent polygon.
23. The method of claim 19 wherein the 3D graphic object comprises a plurality of polygons defined by coordinates in the object space, and wherein the method further comprises receiving input from a user specifying whether a set of coordinates in the object space and a set of coordinates in the texture space are to be shared between polygons.
24. The method of claim 19 wherein the 3D graphic object comprises a plurality of polygons defined by coordinates in the object space, and wherein the method further comprises receiving input from a user specifying one or more polygons for which the mapping is to be modified.
25. A computer-implemented method of applying a texture to a 3D graphic object, the method comprising:
providing a visual indication of a correlation between a 3D graphic object and a texture map; and
modifying the correlation by selectively applying a predetermined mapping function between a texture space and an object space to a portion of a 3D graphic object.
26. The method of claim 25 wherein modifying the correlation comprises selectively applying one of a plurality of predetermined mapping functions between a texture space and an object space to different portions of the 3D graphic object.
27. The method of claim 25 further comprising receiving user input that specifies one of a plurality of predetermined mapping functions between a texture space and an object space to be applied to the portion of the 3D graphic object.
28. The method of claim 27 further comprising receiving further user input that specifies one of a plurality of different portions of the 3D graphic object to which the specified mapping function is to be applied.
29. The method of claim 25 further comprising:
presenting a graphical user interface to a user of a computer system; and
receiving input from the user designating a portion of the 3D graphic object and specifying at least one of a plurality of predetermined mapping functions to be applied to the designated portion of the 3D graphic object.
30. The method of claim 25 further comprising rendering the 3D graphic object based on the modified correlation.
31. A user interface for applying a texture to a 3D graphic object, the user interface comprising:
an object region displaying a 3D graphic object formed of polygons;
a texture region displaying a texture map; and
a marker within the texture region defining a mapping between a polygon of the 3D graphic object and the texture map.
32. The user interface of claim 31 wherein the marker comprises a graphic indication.
33. The user interface of claim 31 wherein the marker appears visually similar to a corresponding polygon forming the 3D graphic object.
34. The user interface of claim 31 wherein movement of the marker within the texture map causes a corresponding change in the mapping between the polygon of the 3D graphic object and the texture map.
35. The user interface of claim 31 further comprising a plurality of markers within the texture region each of which defines a mapping between a corresponding polygon of the 3D graphic object and the texture map.
36. The user interface of claim 35 wherein one of the markers can be moved independently of the other markers.
37. The user interface of claim 35 wherein movement of one of the markers affects the mappings of adjacent markers.
38. The user interface of claim 35 wherein movement of one of the markers does not affect the mappings of adjacent markers.
39. The user interface of claim 31 comprising an abstraction through which a user can move the marker within the texture region.
40. The user interface of claim 39 wherein the abstraction corresponds to translating movement.
41. The user interface of claim 39 wherein the abstraction corresponds to scaling movement.
42. The user interface of claim 39 wherein the abstraction corresponds to rotating movement.
43. Computer software for a 3D graphic content development system, the computer software residing on a computer-readable medium and comprising instructions for causing a computer to perform the following operations:
provide a graphic indication of a mapping between a 3D graphic object and a texture map;
receive input from a user of the 3D graphic content development system specifying a manipulation of the graphic indication;
based on the specified manipulation of the graphic indication, modify the mapping between the 3D graphic object and the texture map; and
render a view of the 3D object using the modified mapping.
44. A 3D graphic content development system comprising:
a graphical user interface for receiving input in the form of cursor manipulations from a user of the system;
an interactive development environment for creating or modifying a 3D graphic object based on input received from the user through the graphical user interface;
a texture applicator utility for applying a texture map to the 3D graphic object by establishing a correlation between coordinates of the 3D graphic object and coordinates of the texture map; and
a texture manipulator utility that displays a graphic representation of the correlation between coordinates of the 3D graphic object and coordinates of the texture map and which enables the user to modify the correlation by selectively altering features of the graphic representation of the correlation through corresponding cursor manipulations.
US08/903,440 1997-07-30 1997-07-30 Texture mapping 3d graphic objects Abandoned US20020154132A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US08/903,440 US20020154132A1 (en) 1997-07-30 1997-07-30 Texture mapping 3d graphic objects
US11/227,779 US7148899B2 (en) 1997-07-30 2005-09-14 Texture mapping 3D objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US08/903,440 US20020154132A1 (en) 1997-07-30 1997-07-30 Texture mapping 3d graphic objects

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/227,779 Continuation US7148899B2 (en) 1997-07-30 2005-09-14 Texture mapping 3D objects

Publications (1)

Publication Number Publication Date
US20020154132A1 true US20020154132A1 (en) 2002-10-24

Family

ID=25417512

Family Applications (2)

Application Number Title Priority Date Filing Date
US08/903,440 Abandoned US20020154132A1 (en) 1997-07-30 1997-07-30 Texture mapping 3d graphic objects
US11/227,779 Expired - Fee Related US7148899B2 (en) 1997-07-30 2005-09-14 Texture mapping 3D objects

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/227,779 Expired - Fee Related US7148899B2 (en) 1997-07-30 2005-09-14 Texture mapping 3D objects

Country Status (1)

Country Link
US (2) US20020154132A1 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030160785A1 (en) * 2002-02-28 2003-08-28 Canon Europa N.V. Texture map editing
US20040095348A1 (en) * 2002-11-19 2004-05-20 Bleiweiss Avi I. Shading language interface and method
US20040104916A1 (en) * 2002-10-29 2004-06-03 Canon Europa N.V. Apparatus and method for generating texture maps for use in 3D computer graphics
GB2407953A (en) * 2003-11-07 2005-05-11 Canon Europa Nv Texture data editing for three-dimensional computer graphics
US6975334B1 (en) * 2003-03-27 2005-12-13 Systems Paving Method and apparatus for simulating the appearance of paving stone on an existing driveway
US7091984B1 (en) * 2004-03-11 2006-08-15 Nvidia Corporation Scalable desktop
US20070057939A1 (en) * 2005-09-09 2007-03-15 Microsoft Corporation 2D/3D combined rendering
US7626589B2 (en) * 2003-12-10 2009-12-01 Sensable Technologies, Inc. Haptic graphical user interface for adjusting mapped texture
US20100080489A1 (en) * 2008-09-30 2010-04-01 Microsoft Corporation Hybrid Interface for Interactively Registering Images to Digital Models
US7710415B2 (en) 2001-01-08 2010-05-04 Sensable Technologies, Inc. Systems and methods for three-dimensional modeling
US7800609B2 (en) 1996-08-02 2010-09-21 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with a haptic virtual reality environment
US7808509B2 (en) 2003-10-30 2010-10-05 Sensable Technologies, Inc. Apparatus and methods for stenciling an image
US20100289798A1 (en) * 2009-05-13 2010-11-18 Seiko Epson Corporation Image processing method and image processing apparatus
US7864173B2 (en) 1998-07-17 2011-01-04 Sensable Technologies, Inc. Systems and methods for creating virtual objects in a sketch mode in a haptic virtual reality environment
US7889209B2 (en) * 2003-12-10 2011-02-15 Sensable Technologies, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US20110298800A1 (en) * 2009-02-24 2011-12-08 Schlichte David R System and Method for Mapping Two-Dimensional Image Data to a Three-Dimensional Faceted Model
US20120081384A1 (en) * 2005-03-04 2012-04-05 Arm Norway As Method of and apparatus for encoding and decoding data
US20130162633A1 (en) * 2003-12-10 2013-06-27 Geomagic, Inc. Apparatus and methods for adjusting a texture wrapping onto the surface of a virtual object
US8700996B2 (en) 1998-08-28 2014-04-15 Corel Corporation Real time preview
US20150161763A1 (en) * 2011-10-07 2015-06-11 Zynga Inc. 2d animation from a 3d mesh
US20150332460A1 (en) * 2007-11-30 2015-11-19 Microsoft Technology Licensing, Llc Interactive geo-positioning of imagery
US20170024925A1 (en) * 2015-07-21 2017-01-26 Makerbot Industries, Llc Three-dimensional surface texturing
US9802364B2 (en) 2011-10-18 2017-10-31 3D Systems, Inc. Systems and methods for construction of an instruction set for three-dimensional printing of a user-customizable image of a three-dimensional structure
WO2020095292A1 (en) * 2018-11-08 2020-05-14 Avi Mordechai Sharir A method for transforming 3-dimensional image data into a 2-dimensional image
US20210375011A1 (en) * 2016-12-28 2021-12-02 Shanghai United Imaging Healthcare Co., Ltd. Image color adjustment method and system
US20220405999A1 (en) * 2021-06-21 2022-12-22 International Business Machines Corporation Uv map using weight painting

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8930844B2 (en) * 2000-08-22 2015-01-06 Bruce Carlin Network repository of digitalized 3D object models, and networked generation of photorealistic images based upon these models
US7523411B2 (en) * 2000-08-22 2009-04-21 Bruce Carlin Network-linked interactive three-dimensional composition and display of saleable objects in situ in viewer-selected scenes for purposes of object promotion and procurement, and generation of object advertisements
EP1520268A4 (en) * 2002-07-10 2009-05-20 Harman Becker Automotive Sys System for texturizing electronic representations of objects
EP1787263A1 (en) * 2004-08-10 2007-05-23 Allegorithmic Mesh design method and tool
US20080037066A1 (en) * 2006-08-10 2008-02-14 Sauer Charles M Method and Apparatus for Providing Three-Dimensional Views of Printer Outputs
US7961945B2 (en) * 2007-02-13 2011-06-14 Technische Universität München System and method for on-the-fly segmentations for image deformations
JP2011022726A (en) * 2009-07-14 2011-02-03 Sony Corp Image processing apparatus and method
US9471934B2 (en) * 2011-02-25 2016-10-18 Nokia Technologies Oy Method and apparatus for feature-based presentation of content
US9508196B2 (en) * 2012-11-15 2016-11-29 Futurewei Technologies, Inc. Compact scalable three dimensional model generation
US10891766B1 (en) * 2019-09-04 2021-01-12 Google Llc Artistic representation of digital data

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764237A (en) * 1994-10-07 1998-06-09 Kaneko; Koichi Texture mapping apparatus computing texture address by fill address
US5973701A (en) * 1996-09-30 1999-10-26 Cirrus Logic, Inc. Dynamic switching of texture mip-maps based on pixel depth value
US6037948A (en) * 1997-03-07 2000-03-14 Silicon Graphics, Inc. Method, system, and computer program product for updating texture with overscan
US6348917B1 (en) * 1996-09-30 2002-02-19 Cirrus Logic, Inc Dynamic switching of texture mip-maps based on depth
US6683606B1 (en) * 1996-03-05 2004-01-27 Canon Kabushiki Kaisha Virtual architecture experience method and apparatus

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1595964A (en) 1977-03-17 1981-08-19 Micro Consultants Ltd Tv Special effects generator
FR2480545A1 (en) 1980-04-10 1981-10-16 Micro Consultants Ltd DEVICE AND METHOD FOR PRINTING ANGULAR DISPLACEMENT TO A TELEVISION IMAGE
DE3277247D1 (en) 1982-12-22 1987-10-15 Ibm Image transformations on an interactive raster scan or matrix display
US5448687A (en) 1988-09-13 1995-09-05 Computer Design, Inc. Computer-assisted design system for flattening a three-dimensional surface and for wrapping a flat shape to a three-dimensional surface
US5255352A (en) 1989-08-03 1993-10-19 Computer Design, Inc. Mapping of two-dimensional surface detail on three-dimensional surfaces
GB2240017A (en) 1990-01-15 1991-07-17 Philips Electronic Associated New, interpolated texture values are fed back to texture memories
US5333245A (en) 1990-09-07 1994-07-26 Modacad, Inc. Method and apparatus for mapping surface texture
US5230039A (en) 1991-02-19 1993-07-20 Silicon Graphics, Inc. Texture range controls for improved texture mapping
US5469535A (en) 1992-05-04 1995-11-21 Midway Manufacturing Company Three-dimensional, texture mapping display system
US5396590A (en) 1992-09-17 1995-03-07 Apple Computer, Inc. Non-modal method and apparatus for manipulating graphical objects
US5412765A (en) 1992-12-21 1995-05-02 General Electric Company Method for vector field visualization using time varying texture maps
US5606650A (en) 1993-04-22 1997-02-25 Apple Computer, Inc. Method and apparatus for storage and retrieval of a texture map in a graphics display system
US5550960A (en) 1993-08-02 1996-08-27 Sun Microsystems, Inc. Method and apparatus for performing dynamic texture mapping for complex surfaces
US5630043A (en) 1995-05-11 1997-05-13 Cirrus Logic, Inc. Animated texture map apparatus and method for 3-D image displays
US6002410A (en) 1997-08-25 1999-12-14 Chromatic Research, Inc. Reconfigurable texture cache

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764237A (en) * 1994-10-07 1998-06-09 Kaneko; Koichi Texture mapping apparatus computing texture address by fill address
US6683606B1 (en) * 1996-03-05 2004-01-27 Canon Kabushiki Kaisha Virtual architecture experience method and apparatus
US5973701A (en) * 1996-09-30 1999-10-26 Cirrus Logic, Inc. Dynamic switching of texture mip-maps based on pixel depth value
US6348917B1 (en) * 1996-09-30 2002-02-19 Cirrus Logic, Inc Dynamic switching of texture mip-maps based on depth
US6037948A (en) * 1997-03-07 2000-03-14 Silicon Graphics, Inc. Method, system, and computer program product for updating texture with overscan

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7800609B2 (en) 1996-08-02 2010-09-21 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with a haptic virtual reality environment
US8576222B2 (en) 1998-07-17 2013-11-05 3D Systems, Inc. Systems and methods for interfacing with a virtual object in a haptic virtual environment
US7864173B2 (en) 1998-07-17 2011-01-04 Sensable Technologies, Inc. Systems and methods for creating virtual objects in a sketch mode in a haptic virtual reality environment
US8700996B2 (en) 1998-08-28 2014-04-15 Corel Corporation Real time preview
US9092119B2 (en) 1998-08-28 2015-07-28 Corel Software LLC Real time preview
US7710415B2 (en) 2001-01-08 2010-05-04 Sensable Technologies, Inc. Systems and methods for three-dimensional modeling
US20030160785A1 (en) * 2002-02-28 2003-08-28 Canon Europa N.V. Texture map editing
US7561164B2 (en) * 2002-02-28 2009-07-14 Canon Europa N.V. Texture map editing
US20040104916A1 (en) * 2002-10-29 2004-06-03 Canon Europa N.V. Apparatus and method for generating texture maps for use in 3D computer graphics
US7019754B2 (en) * 2002-10-29 2006-03-28 Canon Europa N.V. Apparatus and method for generating texture maps for use in 3D computer graphics
US20040095348A1 (en) * 2002-11-19 2004-05-20 Bleiweiss Avi I. Shading language interface and method
US6975334B1 (en) * 2003-03-27 2005-12-13 Systems Paving Method and apparatus for simulating the appearance of paving stone on an existing driveway
US7808509B2 (en) 2003-10-30 2010-10-05 Sensable Technologies, Inc. Apparatus and methods for stenciling an image
GB2407953A (en) * 2003-11-07 2005-05-11 Canon Europa Nv Texture data editing for three-dimensional computer graphics
US8456484B2 (en) 2003-12-10 2013-06-04 3D Systems, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US7889209B2 (en) * 2003-12-10 2011-02-15 Sensable Technologies, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US8174535B2 (en) 2003-12-10 2012-05-08 Sensable Technologies, Inc. Apparatus and methods for wrapping texture onto the surface of a virtual object
US20130162633A1 (en) * 2003-12-10 2013-06-27 Geomagic, Inc. Apparatus and methods for adjusting a texture wrapping onto the surface of a virtual object
US7626589B2 (en) * 2003-12-10 2009-12-01 Sensable Technologies, Inc. Haptic graphical user interface for adjusting mapped texture
US8963958B2 (en) * 2003-12-10 2015-02-24 3D Systems, Inc. Apparatus and methods for adjusting a texture wrapped onto the surface of a virtual object
US7091984B1 (en) * 2004-03-11 2006-08-15 Nvidia Corporation Scalable desktop
US20120081384A1 (en) * 2005-03-04 2012-04-05 Arm Norway As Method of and apparatus for encoding and decoding data
US8289343B2 (en) * 2005-03-04 2012-10-16 Arm Norway As Method of and apparatus for encoding and decoding data
US7999807B2 (en) 2005-09-09 2011-08-16 Microsoft Corporation 2D/3D combined rendering
US20070057939A1 (en) * 2005-09-09 2007-03-15 Microsoft Corporation 2D/3D combined rendering
US20150332460A1 (en) * 2007-11-30 2015-11-19 Microsoft Technology Licensing, Llc Interactive geo-positioning of imagery
US20100080489A1 (en) * 2008-09-30 2010-04-01 Microsoft Corporation Hybrid Interface for Interactively Registering Images to Digital Models
US20110298800A1 (en) * 2009-02-24 2011-12-08 Schlichte David R System and Method for Mapping Two-Dimensional Image Data to a Three-Dimensional Faceted Model
US9305390B2 (en) * 2009-02-24 2016-04-05 Textron Innovations Inc. System and method for mapping two-dimensional image data to a three-dimensional faceted model
US20100289798A1 (en) * 2009-05-13 2010-11-18 Seiko Epson Corporation Image processing method and image processing apparatus
US20150161763A1 (en) * 2011-10-07 2015-06-11 Zynga Inc. 2d animation from a 3d mesh
US9652880B2 (en) * 2011-10-07 2017-05-16 Zynga Inc. 2D animation from a 3D mesh
US9802364B2 (en) 2011-10-18 2017-10-31 3D Systems, Inc. Systems and methods for construction of an instruction set for three-dimensional printing of a user-customizableimage of a three-dimensional structure
US20170024925A1 (en) * 2015-07-21 2017-01-26 Makerbot Industries, Llc Three-dimensional surface texturing
US9934601B2 (en) * 2015-07-21 2018-04-03 Makerbot Industries, Llc Three-dimensional surface texturing
US20180253887A1 (en) * 2015-07-21 2018-09-06 Makerbot Industries, Llc Three-dimensional surface texturing
US20210375011A1 (en) * 2016-12-28 2021-12-02 Shanghai United Imaging Healthcare Co., Ltd. Image color adjustment method and system
WO2020095292A1 (en) * 2018-11-08 2020-05-14 Avi Mordechai Sharir A method for transforming 3-dimensional image data into a 2-dimensional image
US20220405999A1 (en) * 2021-06-21 2022-12-22 International Business Machines Corporation Uv map using weight painting

Also Published As

Publication number Publication date
US20060087505A1 (en) 2006-04-27
US7148899B2 (en) 2006-12-12

Similar Documents

Publication Publication Date Title
US7148899B2 (en) Texture mapping 3D objects
US5903270A (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
US8514238B2 (en) System and method for adding vector textures to vector graphics images
US6434277B1 (en) Image processing apparatus and method, and medium therefor
US5012433A (en) Multistage clipping method
EP0887771B1 (en) Method and apparatus for composing layered synthetic graphics filters
US7199793B2 (en) Image-based modeling and photo editing
US5969722A (en) Methods and apparatus for creation of three-dimensional wire frames and for three-dimensional stereo morphing
US6529206B1 (en) Image processing apparatus and method, and medium therefor
US6867787B1 (en) Character generator and character generating method
US5425137A (en) System and method for processing images using computer-implemented software objects representing lenses
EP1703469A1 (en) Generating a wipe effect using a 3D model
US7616201B2 (en) Casting shadows
JP2002507799A (en) Probabilistic level of computer animation
US11176722B2 (en) Composing an animation scene in a computer-generated animation
US8698830B2 (en) Image processing apparatus and method for texture-mapping an image onto a computer graphics image
US20130016098A1 (en) Method for creating a 3-dimensional model from a 2-dimensional source image
US7663638B2 (en) Stroked fill
EP0887770B1 (en) Method and apparatus for defining the scope of operation of layered synthetic graphics filters
JPH05143711A (en) Image generating method
JP2003504697A (en) Anti-aliasing of subsampled texture edges
JP2000339499A (en) Texture mapping and texture mosaic processor
EP0978102A2 (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
JPH07234949A (en) Method and system for supporting preparation of perspective drawing
WO1997045782A8 (en) Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface

Legal Events

Date Code Title Description
AS Assignment

Owner name: SILICON GRAPHICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUMESNY, ALAIN M.;FOUTS, CHRISTOPHER L.;REEL/FRAME:009032/0344

Effective date: 19970730

AS Assignment

Owner name: SILICON GRAPHICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST RE-RECORD TO CORRECT THE RECORDATION DATE OF 3-10-98 TO 3-11-98 PREVIOUSLY RECORDED AT REEL 9032, FRAME 344;ASSIGNORS:DUMESNY, ALAIN M.;FOUTS, CHRISTOPHER L.;REEL/FRAME:009213/0541

Effective date: 19970730

AS Assignment

Owner name: PLATINUM TECHNOLOGY IP, INC., ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PLATINUM TECHNOLOGY, INC.;REEL/FRAME:009670/0906

Effective date: 19981223

Owner name: PLATINUM TECHNOLOGY, INC., ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SILICON GRAPHICS, INC.;REEL/FRAME:009678/0261

Effective date: 19981223

AS Assignment

Owner name: COMPUTER ASSOCIATES THINK, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PLATINUM TECHNOLOGY IP, INC.;REEL/FRAME:010351/0936

Effective date: 19991028

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE