CA2415912A1 - Animation editing apparatus - Google Patents



Publication number
CA2415912A1
Authority
CA
Canada
Prior art keywords
editing
simulated
animation
visual display
labels
Prior art date
Legal status
Abandoned
Application number
CA002415912A
Other languages
French (fr)
Inventor
Andre Gauthier
Current Assignee
Alias Systems Corp
Original Assignee
Kaydara Inc
Priority date
Filing date
Publication date
Application filed by Kaydara Inc filed Critical Kaydara Inc
Publication of CA2415912A1 publication Critical patent/CA2415912A1/en
Abandoned legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T13/00 Animation
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Abstract

Animation editing apparatus for editing animation data, comprising data storage means, processing means, visual display means and a manually responsive input device configured to allow a user to indicate a selected point on the visual display means.
The visual display means displays an image representing a simulated three-dimensional world-space including a plurality of simulated objects, and the manually responsive input device provides an input signal indicating a location within the image corresponding to one of the simulated objects. In response to receiving the input signal, the processing means identifies the selected simulated object, and retrieves data from the data storage means of one or more related items related to the selected simulated object within a defined degree of relationship. The visual display means displays labels identifying the selected simulated object and the related items only.

Description

Animation Editing Apparatus

Background of the Invention

1. Field of the Invention

The present invention relates to animation editing apparatus for editing animation data, and a method of editing animation data in a data processing system.
2. Description of the Related Art

Computerised systems for the editing of animation data have been used for some time. In order to provide a human editor with access to the required editable parameters of an animation, it is known for such systems to display a hierarchical representation of the items defining a whole scene. The problem with this approach is that, for a complex, high-resolution scene, the editor may be confronted with a hierarchy containing thousands of items representing hundreds of simulated objects and their associated attributes.
Brief Summary of the Invention

According to a first aspect of the present invention there is provided animation editing apparatus for editing animation data, said apparatus comprising data storage means, processing means, visual display means and a manually responsive input device configured to allow a user to indicate a selected point on the visual display means, wherein: said visual display means is configured to display an image representing a simulated three-dimensional world-space including a plurality of simulated objects; said manually responsive input device is configured to provide an input signal indicating a location within said image corresponding to one of said simulated objects; said processing means is configured to identify the selected simulated object in response to receiving said input signal, and to retrieve data from said data storage means of one or more related items related to said selected simulated object within a defined degree of relationship; and said visual display means is configured to display labels identifying the selected simulated object and said related items only.

Brief Description of the Several Views of the Drawings

Figure 1 shows an animation artist 101 equipped with a computer system 102 for editing animated graphics;
Figure 2 shows an example of an animation to be edited by the user 101;
Figure 3 shows details of computer system 102;
Figure 4 shows a flow chart outlining the operation of the system 102;
Figures 5A and 5B show data contained within main memory 302 following step 404 of Figure 4;
Figure 6 shows a table representing a database containing hierarchy data 510;
Figure 7 shows a scene tree according to the prior art;
Figure 8 shows a graphical user interface (GUI) 801 produced by the application program and displayed on the visual display unit 104;
Figure 9 shows the navigation window 804 after user selection of the label for the Whitehouse texture;
Figure 10 shows the navigation window 804 after selection of the "CUBE1" label 813;
Figure 11 shows the step 405 of editing project data in greater detail;
Figure 12 shows more detail of the step 1104 of updating the user interface;
Figure 13 shows further detail of the step 1204 of displaying labels for a selected simulated object and directly related objects;
Figure 14 shows further detail of the step 1206 of displaying a selected label with directly related items;
Figure 15 shows a second example of an animation project to be edited by the user 101;
Figure 16 shows a conventional scene tree representing the animation of Figure 15;
Figure 17 shows the visual display unit 104 as it appears during editing of the animated character 1501; and
Figure 18 shows the navigation window 804 after the user has double clicked on the label "BODY-BB07" 1702.
Best Mode for Carrying Out the Invention

Figure 1

An animation artist 101 equipped with a computer system 102 for editing animated graphics is shown in Figure 1. The artist 101 inputs signals to the computer system 102 by manual operation of a mouse 103, while viewing a graphical user interface on a visual display unit 104. In response to receiving the input signals from the mouse 103, the system 102 performs functions including: editing animation data; presenting selected editing tools on the user interface; and presenting representations of selected items to be edited.
As an alternative to using a mouse 103, the artist 101 could be provided with a stylus/touch-tablet combination, or a trackball or similar manually responsive input device.
Figure 2

An example of an animation to be edited by the user 101 is shown in Figure 2. The animation comprises three simulated three-dimensional objects 201, 202 and 203 within a simulated three-dimensional world-space. The first of the simulated objects is a cube 201 which is animated such that it rotates about a vertical axis passing through its upper face 204, while apparently making movements towards the viewer. Consequently, during the animation, five of its six faces may be viewed. A bit-map image has been texture-mapped onto each of the five viewable faces, and thus, for example, face 204 appears to comprise a two-dimensional image of a battleship, while faces 205 and 206 comprise images of a space shuttle and the Whitehouse respectively. The two remaining simulated objects 202 and 203 are spheres which are animated such that they appear to rotate about the cube 201, in a similar manner to satellites orbiting a planet. The spheres have also had bit-map images mapped onto their surfaces, so that sphere 203 supports the same image as cube face 206, while sphere 202 supports the same image as one of the other cube faces presently not viewable.

Figure 3

Computer system 102 is detailed in Figure 3. It includes a central processing unit 301 such as an Intel Pentium 4 processor or similar.
Central processing unit 301 receives instructions from memory 302 via a system bus 303. On power-up, instructions are read to memory 302 from a hard disk drive 304. Programs are loaded to the hard disk drive 304 by means of a CD-ROM received within a CD-ROM drive 305. Output signals to the display unit are supplied via a graphics card 306, and input signals from the mouse 103, similar devices and a keyboard are received via input card 307. The system also includes a zip drive 308 and a network card 309, each configured to facilitate the transfer of data into and out of the system.
The present invention is embodied by an animation editing program installed from a CD ROM 310 via the CD-ROM drive 305.
Figure 4

A flow chart outlining the operation of the system 102 is shown in Figure 4. Following the system 102 being switched on at step 401, instructions are loaded from the hard disk drive 304 into main memory 302 at step 402. Upon completion of the loading operation at step 402, the application starts at step 403, and a graphical user interface for editing animation projects is initialised. At step 404, existing project data is loaded into memory such that it may be edited at step 405. The updated project data is then stored at step 406. At step 407 it is determined whether or not a further project is to be edited, and if so, steps 404 to 407 are repeated.
Otherwise, the system is shut down at step 408.

2005-P10&CA
Figures 5A and 5B
Data contained within main memory 302 following step 404 is shown in Figure 5A. Thus following the loading steps 402 and 404, the memory 302 contains an operating system 501, which may be for example Windows or Linux, the application program 502 by which animated graphics may be edited as described below, and project data 503 defining the particular animation being edited.
The project data 503 is illustrated in greater detail in Figure 5B. The project data comprises: hierarchy data 510; three-dimensional model data 511; materials data 512; animation data 513; and other node data 514. The three-dimensional model data 511 defines the geometry of simulated objects appearing in the graphic. Thus, for example, during editing of the animation of Figure 2, the model data 511 comprises data defining the shape and dimensions of objects 201, 202 and 203. Materials data 512 comprises data defining colours, textures or patterns appearing in the animation, such as the Whitehouse, battleship, space shuttle and astronaut textures used in the animation of Figure 2. The animation data 513 defines the simulated three-dimensional movement of the simulated objects appearing in the animation. Thus, for example, animation data 513 may define the manner in which the cube 201 rotates and the manner in which the spheres 202 and 203 orbit the cube. Other node data 514 defines other characteristics of the animation, such as: the set in which the simulated objects appear; the lights; the cameras; etc.
The hierarchy data 510 defines relationships existing between items defined by data 511, 512, 513 and 514. Thus, the hierarchy data 510 defines relationships between simulated objects defined by data 511, material items defined by data 512, animations defined by data 513, and lights, cameras etc. defined by data 514.
Figure 6

The hierarchy data may be stored within the memory of the system 102 as a database. A table representing a database containing hierarchy data 510 is shown in Figure 6. Each item defining an animation is provided with a node label, suitable for identifying the item to a user such as artist 101, and a node identity.
The example provided in Figure 6 corresponds to the animation of Figure 2. Thus, there are three items given the node labels "CUBE1", "SPHERE1" and "SPHERE2" defining the simulated cube 201 and spheres 202 and 203. There are also five items, with node identities 11, 12, 13, 14 and 15, defining textures, and three items with node identities 10, 16 and 17 defining the animation of the three simulated objects. One light and one camera have been defined for the present animation and these are identified by node identities 8 and 3.
Two other items labelled "target scene" and "scene renderer" are also included in the database. The "target scene" defines the overall composition of the animation. The "scene renderer" is a process for rendering the three-dimensional animation, defined by the target scene, into a two-dimensional animated image that is suitable for display.
It should be understood that an item may be a data-set, or a process which defines a part of an animation, such as a simulated object, an attribute of an object, the overall composition of the animation or the rendering of the animation.
The relationships existing between the items of the database are illustrated by the third column, "PARENT OF", and the fourth column, "CHILD OF", of the table. The two relationships are the opposite of each other, and thus, if item "A" is a parent of item "B", then item "B" is a child of item "A".
For example, the sixth line shows that "SPHERE1" is the parent of "Texture-Whitehouse" (node identity 11) and "Animation-Orbit H" (node identity 16), while line 11 shows that "Texture-Whitehouse" is a child of "SPHERE1" (node ID 6) and line 16 shows that "Animation-Orbit H" is a child of "SPHERE1" (node ID 6). Thus the attributes of objects are children of the object.
The two spheres, "SPHERE1" and "SPHERE2", have been constrained to the object "CUBE1" such that they follow the movement of "CUBE1" during the animation. An offset constraint is used so that the spheres are held apart from "CUBE1". Because the spheres are constrained to "CUBE1", they are its children, as indicated in Figure 6.
The database therefore contains data indicating which items are directly related and the nature of the relationship, that is, "parent of" or "child of". The database is therefore designed such that, given the node identity of an item, the node identities of its children may be looked up in the third column, or its parent may be looked up in the fourth column.
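The look-up scheme just described can be sketched as a small Python model. The node labels and identities below loosely follow the example of Figure 6, but the exact table contents and the function names are illustrative assumptions, not the patent's own code.

```python
# Illustrative model of the hierarchy database of Figure 6.
# Each node identity maps to its label, its "CHILD OF" entries (parents)
# and its "PARENT OF" entries (children). Values are hypothetical.
HIERARCHY = {
    # node_id: (label, parent_ids, child_ids)
    5:  ("CUBE1", [], [6, 11]),
    6:  ("SPHERE1", [5], [11, 16]),
    11: ("Texture-Whitehouse", [5, 6], []),
    16: ("Animation-Orbit H", [6], []),
}

def children_of(node_id):
    """Look up a node's children in the 'PARENT OF' column."""
    return HIERARCHY[node_id][2]

def parents_of(node_id):
    """Look up a node's parents in the 'CHILD OF' column."""
    return HIERARCHY[node_id][1]
```

In this sketch the two columns are mutual inverses, as in the table: the Whitehouse texture (node 11) lists the cube and sphere as parents, and each of those lists node 11 among its children.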
Figure 7

A scene tree according to the prior art is shown in Figure 7. Scene trees provide a graphical representation of the hierarchical structure of items defining an animation. Conventional editing systems provide such a scene tree to allow an artist such as artist 101 to identify and edit parameters of an animation.
The scene tree of Figure 7 illustrates the animation items and relationships shown in the table of Figure 6. Thus, cube 201 is represented by a node 701 which has lead-lines down to nodes 702, 703, 704, 705, 706, 707, 708 and 709 representing respectively the animation and the five textures applied to the cube 201, and the spheres 202 and 203. Similarly, spheres 202 and 203, represented by nodes 708 and 709, have associated attributes represented by nodes 710 and 711, and 712 and 713, respectively. A node 714 is used to represent the set, and this has lead-lines down to nodes 715 and 716 representing the texture applied to the set and the light applied to the set.
Lead-lines drawn up from the set 714 and cube 701 to another node 717, representing the target scene, indicate that the target scene is the parent of said set and cube. Lead-lines from the node 717 and a node 718 representing a defined camera, "CAMERA1", up to scene renderer node 719 show that the scene renderer is the parent of said camera and the target scene.
The graphical animation illustrated in Figure 2 is simple, being composed of just three simple objects, and yet the corresponding scene tree, shown in Figure 7, has nineteen nodes. In animations which are more complex, for example containing many human characters or animals, scene trees have been known to contain hundreds and in some cases thousands of nodes. It will therefore be understood that a user such as artist 101 will have difficulties in navigating around the scene tree in such cases.
In contrast, as described below, the system 102 provides a user interface which allows its user to navigate around the objects of a scene and related attributes in such a way that only items closely related to a user-selected item are displayed. The user is therefore presented only with information which is relevant to their present interest, and only of a limited volume, thus making it relatively easy to comprehend when compared to the scene tree of Figure 7.
Furthermore, the animation editing system 102 preferably includes character registration mapping as described in the applicant's co-pending Canadian patent application published as CA 2,314,712. It has been found that the character registration mapping, in combination with the features of the graphical user interface described herein, allows a user to perform animation editing without the need to refer to a scene tree, such as that shown in Figure 7.
Another point to note from Figure 7 is that although the textures applied to the spheres 202 and 203 are the same as two of the textures applied to the cube 201, there is no indication of this in the scene tree.
This can be a problem in more complex animations where, for example, attributes such as colours and textures are applied to several objects but, due to the complexity of the scene, it is not apparent to the artist. For example, an artist may edit the parameters of an attribute applied to an object, not realising that they are also editing an attribute of other objects.
Figure 8

A graphical user interface (GUI) 801 produced by the application program and displayed on the visual display unit 104 is shown in Figure 8.
The user interface 801 contains four windows: icon window 802, viewer window 803, navigation window 804 and tool window 805. Viewer window 803 contains a two-dimensional representation of the three-dimensional animation which is to be edited. Thus, in the present example, window 803 contains a virtual floor 806, apparently above which are arranged the cube 201 and the two spheres 202 and 203. In addition, window 803 also contains a virtual directional light which appears to illuminate the objects 201, 202 and 203. Window 803 also presently contains the cursor 808, which may be moved across the whole display by means of the mouse 103.
The icon window 802 contains a number of icons which facilitate the creation of new simulated objects, the addition of materials to objects within the scene, the animation of objects within the scene, etc.
The navigation window 804 displays a number of labels representing selected items defining the animation. The particular items displayed by the navigation window are selected by the application program in response to the user's input. Specifically, when the system receives an input indicating that a simulated object in viewer window 803 has been selected by the user, the system displays a label representing said selected object at the top of the navigation window 804, and then displays labels representing other items which are directly related to the selected simulated object.
"Directly related" is herein defined as meaning "being a parent of, or being a child of". Thus, if two items are directly related, then one is a child of the other, and when a simulated object is selected by the user, labels representing said object, the child or children of said object and the parent of said object are displayed in the navigation window.
For example, in Figure 8, the user 101 has double clicked the mouse with the cursor 808 over the sphere 203 in window 803, in order to indicate to the system that the sphere 203 is to be represented in the navigation window 804. Therefore, a label 810 representing sphere 203, labels 811 and 812 representing its attributes and another label 813 representing its parent, "CUBE1", are displayed in window 804.
The user 101 is therefore presented with only the portion of the hierarchical structure that they are interested in, rather than being confronted with the whole scene tree.
The application selects suitable tools for editing the selected item and displays said tools within window 805.
After selecting a particular simulated object, the user may then select another such object by clicking on the relevant object in viewer window 803. Alternatively, the user may navigate around the hierarchy structure by clicking on labels displayed in the navigation window 804. For example, the user could view the items directly related to the cube 201 by clicking on the label "CUBE1" 813, or if they wished to edit the texture applied to the sphere 203 they could click on the label 812.
The application program is therefore structured such that if the system receives an input indicating that a label within the navigation window has been selected, it displays the selected label at the top of the navigation window and displays labels of directly related items below it.
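A minimal sketch of this navigation-window logic follows, in Python, with hypothetical look-up tables standing in for the database of Figure 6; the names and node identities are illustrative assumptions only.

```python
# Sketch: build the navigation window's label list for a selected item.
# The selected label comes first, then labels of all directly related
# items (parents and children). Tables below are hypothetical stand-ins.
LABELS = {5: "CUBE1", 6: "SPHERE1", 11: "Texture-Whitehouse",
          16: "Animation-Orbit H"}
PARENTS = {5: [], 6: [5], 11: [5, 6], 16: [6]}        # "CHILD OF" column
CHILDREN = {5: [6, 11], 6: [11, 16], 11: [], 16: []}  # "PARENT OF" column

def navigation_labels(selected_id):
    """Return the labels to display: the selected item at the top of the
    navigation window, followed by its directly related items only."""
    related = PARENTS[selected_id] + CHILDREN[selected_id]
    return [LABELS[selected_id]] + [LABELS[i] for i in related]
```

In this sketch, selecting the Whitehouse texture (node 11) yields the list ["Texture-Whitehouse", "CUBE1", "SPHERE1"]: the selected label followed by both parent objects, mirroring the behaviour described for Figure 9.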
An example of this functionality is provided by Figure 9.
Figure 9

The navigation window 804 after user selection of the label for the Whitehouse texture is shown in Figure 9. By clicking on the Whitehouse texture label 812 in Figure 8, the user indicates to the system that the Whitehouse texture is to be selected. Thus, as shown in Figure 9, the Whitehouse label 812 is repositioned at the top of the navigation window 804 to identify it as the selected item, and labels representing items directly related to said texture are displayed below it. Therefore, window 804 displays labels 810 and 813 representing the simulated objects sphere 203 and cube 201, making it clear to the user that the Whitehouse texture is used on both the sphere 203 and the cube 201.
The windows 802 and 803 remain unchanged in appearance, and so continue to display creation tool icons and a view of the animation, while window 805 is updated to display appropriate editing tools. If the user now wishes to divert their attention to the cube 201, they may update the navigation window by selecting the relevant label 813 using the mouse.
Figure 10

Figure 10 shows the navigation window 804 after selection of the "CUBE1" label 813. Once again the selected item has been positioned at the top of the window 804 and labels of the directly related items are displayed below it. Thus, for example, window 804 displays labels representing the attributes of the selected object, i.e. the cube 201, and labels 1001 and 810 representing the objects 202 and 203 constrained to said selected object.
Figure 11

The processing of data in response to user-generated input commands at step 405 of Figure 4 allows many sophisticated animation techniques to be performed. A portion of the procedures performed, implementing the preferred embodiment of the present invention, is illustrated by Figures 11 to 14. The processes are event driven and will respond to event input data generated by the user. In order to respond to an event, central processing unit 301 responds to interrupts, and the animation program, in combination with the operating system, is required to handle these interrupts in an appropriate manner.
Figure 11 shows greater detail of the step 405 of editing project data. Within step 405, at step 1101, the system receives an input signal generated by a mouse button click. At step 1102, the input signal is interpreted. The input signal provides information relating to the two-dimensional co-ordinates of the mouse cursor, and these co-ordinates are used in combination with a look-up table to determine what the co-ordinates correspond to. If the co-ordinates fall within the viewer window 803, it is then determined whether they correspond to a position on the image of a simulated object, and if so, the node identity for the simulated object is identified. If the co-ordinates fall within the navigation window 804, it is then determined whether a displayed label has been selected, and if so, the node identity corresponding to the node label is identified.
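The co-ordinate interpretation of step 1102 might look like the following sketch. The window bounds are invented placeholders, and the two hit-test callables stand in for the look-up table mentioned above; none of these names come from the patent.

```python
# Sketch of step 1102: map a mouse click's co-ordinates to a node
# identity via the window the click falls in. Bounds are hypothetical.
VIEWER_WINDOW = (0, 0, 640, 480)    # x, y, width, height of window 803
NAV_WINDOW = (640, 0, 200, 480)     # x, y, width, height of window 804

def in_window(x, y, bounds):
    wx, wy, ww, wh = bounds
    return wx <= x < wx + ww and wy <= y < wy + wh

def interpret_click(x, y, object_at, label_at):
    """Return the node identity under the cursor, or None.
    object_at / label_at are hit-test callables standing in for the
    look-up from co-ordinates to a simulated object's image or to a
    displayed label in the navigation window."""
    if in_window(x, y, VIEWER_WINDOW):
        return object_at(x, y)
    if in_window(x, y, NAV_WINDOW):
        return label_at(x, y)
    return None
```

A click inside the viewer window resolves against simulated objects; a click inside the navigation window resolves against displayed labels; clicks elsewhere identify nothing.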
Alternatively, if the user input corresponds to the operation of an editing tool or creation tool, project data will be updated at step 1103. At step 1104, the graphical user interface is updated in response to the user input. Then at step 1105, a question is asked to determine whether the end of the editing session has been indicated by the user input, and if so, step 405 is completed. Otherwise steps 1101 to 1105 are repeated.

Figure 12

The step 1104 of updating the user interface is shown in more detail in Figure 12. At step 1201 it is determined whether the user input indicated the selection of an editing tool or creation tool, or if the project was edited at step 1103, and if so, the display is updated accordingly at step 1202 before step 1203 is performed. Otherwise the process enters step 1203 directly.
At step 1203 a question is asked to determine whether the user input indicated the selection of a simulated three-dimensional object, and if so, step 1204 is performed before step 1205. Otherwise step 1205 is performed directly after step 1203. At step 1204 node labels are displayed corresponding to the selected simulated object and items directly related to said selected object only. Thus, unlike the prior art illustrated in Figure 7, only labels of selected items are displayed and not the whole scene tree.
At step 1205 a question is asked to determine whether the received user input indicated the selection of a label in the navigation window 804. If this is so, then at step 1206 the selected label is displayed in the navigation window 804 along with labels of directly related items only. Completion of step 1206, or a negative answer to the question at step 1205, completes step 1104.
Figure 13

The step 1204 of displaying labels for a selected simulated object and directly related objects is shown in further detail in Figure 13. Within step 1204, at step 1301, using the node identity of the selected simulated object and the database illustrated by example in Figure 6, node identities are retrieved for the parent node and all children nodes of the selected simulated object. Having obtained node identities at step 1301, node labels corresponding to the retrieved parent and child node identities, and for the selected simulated object, are retrieved from the database at step 1302. At step 1303 the node label for the selected simulated object is displayed at the top of the navigation window, and the node labels for its parent and children are also displayed in the navigation window during step 1304.
In an alternative embodiment, as well as displaying labels of the parent and children of the selected simulated object, the parent of the parent is also displayed. In a further alternative embodiment, as well as displaying labels of the parent and children of the selected simulated object, the children of the children are also displayed. However, in the preferred embodiment and these two alternative embodiments, the system only displays a label for the selected simulated object and related items within a defined degree of relationship.
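These embodiments differ only in how far the parent/child links are followed. The general rule can be sketched as a breadth-first walk cut off at the defined degree of relationship; the data shapes and function name below are illustrative assumptions, not the patent's own implementation.

```python
from collections import deque

def related_within(selected_id, degree, parents, children):
    """Collect node identities within `degree` parent/child hops of the
    selected item. degree=1 corresponds to the preferred embodiment
    (parent and children only); degree=2 additionally reaches the
    parent of the parent and the children of the children."""
    seen = {selected_id}
    frontier = deque([(selected_id, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == degree:
            continue  # do not expand beyond the defined degree
        for nxt in parents.get(node, []) + children.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return seen - {selected_id}
```

On a simple chain 1 - 2 - 3 - 4, selecting node 2 with degree 1 returns {1, 3}, while degree 2 also reaches node 4, matching the further alternative embodiments.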
Figure 14

The step 1206 of displaying a selected label with directly related items is shown in further detail in Figure 14. At step 1401, node identities are retrieved for the parent nodes and all children nodes of the selected item. This is done using the node identity of the selected item, determined at step 1102, and the database illustrated by example in Figure 6. It should be noted that where the selected label corresponds to an attribute, there may be more than one parent item. This was the case in the example of Figure 9, where the "Whitehouse" texture had two parent objects, sphere 203 and cube 201.

Having obtained node identities at step 1401, node labels corresponding to the retrieved parent and child node identities, and for the selected label, are retrieved from the database at step 1402. At step 1403 the selected node label is displayed at the top of the navigation window 804, and the node labels for the parents and children are also displayed in the navigation window during step 1404.
Figure 15

A second example of an animation project to be edited by the user 101 is illustrated in Figure 15. The animation comprises a three-dimensional cartoon-like character 1501 who bounces a basketball 1502 as he walks along. The character is wearing baggy trousers 1503 and a large basketball shirt 1504 which appear to move naturally as the character moves along.
Figure 16

A conventional scene tree representing the animation of Figure 15 is shown in Figure 16. Although still quite a simple animation, it is clearly more complex than that of Figure 2, and consequently there are thirty-three nodes in the scene tree.
The character comprises a group of simulated objects in the form of an internal skeleton which allows the character to be positioned and animated, and external objects constrained to the skeleton to provide him with a dressed, human-like appearance. Thus the scene tree has a family of nodes, shown within dashed line 1601, which comprise the skeleton of the character, and other nodes, shown within dashed line 1602, which comprise its outer body.
As can be seen in Figure 16, the skeleton of the character comprises eleven objects referred to as bones. The bones have strict parent-child relationships which determine how the character may be animated using, for instance, forward kinematics or inverse kinematics.
In this example, the body of the character is formed as a single object and represented by node 1603. The body is the parent of other objects including the shirt 1504 and trousers 1503, represented by nodes 1604 and 1605. The shirt and trousers are constrained to the body, so that their animation is determined by the animation of the body. The shirt and trousers are thus children of the body as illustrated by the scene tree. The shirt, trousers and body each have applied textures as represented by nodes 1606, 1607 and 1608 respectively.
Figure 17

The visual display unit 104, as it appears during editing of the animated character 1501, is shown in Figure 17. The character 1501 is displayed in the viewer window 803 with the mouse cursor 808 positioned over him. The user 101 has double clicked on the character and therefore the node label "CHARACTER#8" 1701 representing the animated character is displayed at the top of the navigation window 804. Listed below the label 1701 are labels 1702, 1703, 1704 and 1705 representing the two simulated objects, the character body and the skeleton root, the animation applied to the character, and the target scene respectively.

Figure 18

The navigation window 804 is shown in Figure 18 after the user has double clicked on the label "BODY-BB07" 1702 in Figure 17. Thus, the label 1702 is displayed at the top of the window 804 and its parent and children are represented by labels displayed below it.
The windows 802 and 803 (not shown in Figure 18) remain unchanged in appearance and so continue to display creation tool icons and a view of the animation, while window 805 is updated to display appropriate editing tools.

The body of the character 1501 is one of several simulated objects in a group labelled "CHARACTER#8" which defines the character 1501.
Consequently, "CHARACTER#8" is the parent of said body and so label 1701 is displayed below label 1702.
The simulated objects which provide the appearance of the trousers, shirt and hair of the character 1501 are constrained to the body and so they are children of the body. Thus labels 1801, 1802 and 1803 representing the trousers, shirt and hair are displayed below label 1702 representing the body.
As shown in the scene tree of Figure 76, there are nine levels and thirty-three nodes in the hierarchy for the animation of Figure 75.
However, it will now be clear that the present invention limits the displayed information so that it is of a quantity that may be easily understood by the user.
Furthermore the invention selects the information so that it is most relevant to the user's requirements.
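The selective display described above — listing only the selected item and its directly related items, however large the full hierarchy — can be sketched as follows. This is a hypothetical illustration using the labels of Figure 78, not the patented implementation.

```python
# Hypothetical sketch: the navigation window lists only the selected
# node, its parent, its children and its attributes; the rest of the
# (nine-level, thirty-three-node) hierarchy is omitted.
class Node:
    def __init__(self, name, parent=None, attributes=()):
        self.name = name
        self.parent = parent
        self.children = []
        self.attributes = list(attributes)
        if parent is not None:
            parent.children.append(self)

def navigation_labels(selected):
    labels = [selected.name]                 # selected label shown at the top
    if selected.parent is not None:
        labels.append(selected.parent.name)  # parent shown below it
    labels.extend(child.name for child in selected.children)
    labels.extend(selected.attributes)       # e.g. an applied texture
    return labels

# Labels of Figures 77 and 78: the body is selected, so only the group,
# the body's children and its attributes are listed.
character = Node("CHARACTER#8")
body = Node("BODY-BB07", parent=character, attributes=["TEXTURE"])
trousers = Node("TROUSERS", parent=body)
shirt = Node("SHIRT", parent=body)
hair = Node("HAIR", parent=body)
```

Calling `navigation_labels(body)` yields six labels rather than thirty-three, which is the quantity-limiting behaviour the description attributes to the invention.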

Claims (25)

1. Animation editing apparatus for editing animation data, said apparatus comprising data storage means, processing means, visual display means and a manually responsive input device configured to allow a user to indicate a selected point on the visual display means, wherein:
said visual display means is configured to display an image representing a simulated three-dimensional world-space including a plurality of simulated objects;
said manually responsive input device is configured to provide an input signal indicating a location within said image corresponding to one of said simulated objects;
said processing means is configured to identify the selected simulated object in response to receiving said input signal, and to retrieve data from said data storage means of one or more related items related to said selected simulated object within a defined degree of relationship; and said visual display means is configured to display labels identifying the selected simulated object and said related items only.
2. Animation editing apparatus for editing animation data according to claim 1, wherein said defined degree of relationship is such that said displayed labels identify said selected object and items directly related to said selected object only.
3. Animation editing apparatus for editing animation data according to claim 1, wherein said displayed related items include the parent item of said selected object.
4. Animation editing apparatus for editing animation data according to claim 1, wherein said displayed related items include attributes of said selected simulated object.
5. Animation editing apparatus for editing animation data according to claim 1, wherein said apparatus is configured such that said display identifies said selected simulated object as said selected simulated object.
6. Animation editing apparatus for editing animation data according to claim 1, wherein on receiving an input signal from said manually responsive input device indicating one of said displayed labels, said processing means is configured to determine the identity of a selected item corresponding to said indicated displayed label, and said apparatus is configured such that said visual display means displays labels identifying the selected item and items directly related to said selected item.
7. Animation editing apparatus for editing animation data according to claim 6, wherein when said selected item is an attribute, said apparatus is configured to display labels on said visual display means identifying the simulated objects to which said attribute is applied.
8. Animation editing apparatus for editing animation data according to claim 1, wherein said apparatus is such that when one of said displayed labels represents an attribute, and an input is received at said manually responsive input device indicating selection of the label representing said attribute, said apparatus is configured to display labels on said visual display means identifying the selected attribute and each simulated object to which said attribute is applied.
9. Animation editing apparatus for editing animation data according to claim 1, wherein said displayed related items include other simulated objects which are directly related to said selected simulated object.
10. Animation editing apparatus for editing animation data according to claim 1, wherein said apparatus is configured such that when said selected simulated object is constrained to another simulated object, said displayed related items include said other simulated object.
11. Animation editing apparatus for editing animation data according to claim 1, wherein said apparatus is configured such that when a constrained simulated object is constrained to said selected simulated object, said displayed related items include said constrained simulated object.
12. A method of editing animation data in a data processing system, said system comprising data storage means, processing means, visual display means and manually responsive input device configured to allow a user to indicate a selected point on the visual display means, comprising the steps of:
displaying an image representing a simulated three-dimensional world-space on said visual display means, said world-space including a plurality of simulated objects;
receiving an input at said manually responsive input device indicating a location within said image corresponding to one of said simulated objects;
in response to receiving said input, identifying the selected simulated object;
retrieving data from said data storage means of one or more related items related to said selected simulated object within a defined degree of relationship; and displaying labels on said visual display means identifying the selected simulated object and said related items only.
13. A method of editing animation data in a data processing system according to claim 12, wherein said defined degree of relationship is direct relationship, such that said displayed labels identify said selected object and items directly related to said selected object only.
14. A method of editing animation data in a data processing system according to claim 12, wherein said displayed related items include the parent item of said selected object.
15. A method of editing animation data in a data processing system according to claim 12, wherein said displayed related items include attributes of said selected simulated object.
16. A method of editing animation data in a data processing system according to claim 12, wherein said display identifies said selected simulated object as said selected simulated object.
17. A method of editing animation data in a data processing system according to claim 12, wherein said method comprises the steps of:
receiving an input at said manually responsive input device indicating one of said displayed labels;
determining the identity of a selected item corresponding to said indicated displayed label; and displaying labels on said visual display means identifying the selected item and items directly related to said selected item.
18. A method of editing animation data in a data processing system according to claim 17, wherein when said selected item is an attribute, said method comprises the step of displaying labels on said visual display means identifying the simulated objects to which said attribute is applied.
19. A method of editing animation data in a data processing system according to claim 12, wherein said method is such that when one of said displayed labels represents an attribute, and an input is received at said manually responsive input device indicating selection of the label representing said attribute, said method comprises the step of:

displaying labels on said visual display means identifying the selected attribute and each simulated object to which said attribute is applied.
20. A computer readable medium having computer readable instructions executable by a computer such that, when executing said instructions, a computer will perform the steps of:
display an image representing a simulated three-dimensional world-space on a visual display means, said world-space including a plurality of simulated objects;
receive an input at a manually responsive input device indicating a location within said image corresponding to one of said simulated objects;
in response to receiving said input, identify the selected simulated object;
retrieve data from said data storage means of one or more related items related to said selected simulated object within a defined degree of relationship; and display labels on said visual display means to identify the selected simulated object and said related items only.
21. Animation editing apparatus for editing animation data, said apparatus comprising data storage means, processing means, visual display means and a manually responsive input device configured to allow a user to indicate a selected point on the visual display means, wherein:
said visual display means is configured to display an image representing a simulated three-dimensional world-space including a plurality of simulated objects;

said manually responsive input device is configured to provide an input signal indicating a location within said image corresponding to one of said simulated objects;
said processing means is configured to identify the selected simulated object in response to receiving said input signal, and to retrieve data from said data storage means of one or more directly related items directly related to said selected simulated object, and such that said directly related items include the parent item of said selected object; and said visual display means is configured to display labels identifying the selected simulated object and said directly related items only.
22. Animation editing apparatus for editing animation data according to claim 21, wherein on receiving an input signal from said manually responsive input device indicating one of said displayed labels, said processing means is configured to determine the identity of a selected item corresponding to said indicated displayed label, and said apparatus is configured such that said visual display means displays labels identifying the selected item and items directly related to said selected item.
23. Animation editing apparatus for editing animation data according to claim 22, wherein when said selected item is an attribute, said apparatus is configured to display labels on said visual display means identifying the simulated objects to which said attribute is applied.
24. Animation editing apparatus for editing animation data, said apparatus comprising a personal computer having data storage means, a processor, a visual display unit and a manually responsive input device configured to allow a user to indicate a selected point on the visual display unit, wherein:
said visual display unit is configured to display an image representing a simulated three-dimensional world-space including a plurality of simulated objects;
said manually responsive input device is configured to provide an input signal indicating a location within said image corresponding to one of said simulated objects;
said processor is configured to identify the selected simulated object in response to receiving said input signal, and to retrieve data from said data storage means of one or more related items related to said selected simulated object within a defined degree of relationship; and said visual display unit is configured to display labels identifying the selected simulated object and said related items only, wherein on receiving a further input signal from said manually responsive input device indicating one of said displayed labels, said processor is configured to determine the identity of a selected item corresponding to said indicated displayed label, and said apparatus is configured such that said visual display unit displays labels identifying the selected item and items directly related to said selected item.
25. Animation editing apparatus for editing animation data according to claim 24, wherein when said selected item is an attribute, said apparatus is configured to display labels on said visual display unit identifying the simulated objects to which said attribute is applied.
CA002415912A 2002-07-19 2003-01-09 Animation editing apparatus Abandoned CA2415912A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0216814.4 2002-07-19
GB0216814A GB2391144A (en) 2002-07-19 2002-07-19 Retrieval of information related to selected displayed object

Publications (1)

Publication Number Publication Date
CA2415912A1 true CA2415912A1 (en) 2004-01-19

Family

ID=9940776

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002415912A Abandoned CA2415912A1 (en) 2002-07-19 2003-01-09 Animation editing apparatus

Country Status (3)

Country Link
US (1) US7692657B2 (en)
CA (1) CA2415912A1 (en)
GB (1) GB2391144A (en)



Also Published As

Publication number Publication date
US20040012640A1 (en) 2004-01-22
US7692657B2 (en) 2010-04-06
GB0216814D0 (en) 2002-08-28
GB2391144A (en) 2004-01-28


Legal Events

Date Code Title Description
FZDE Discontinued