US20080184143A1 - Methods for Identifying Actions in a Flowchart - Google Patents
- Publication number
- US20080184143A1 (U.S. application Ser. No. 11/957,076)
- Authority
- US
- United States
- Prior art keywords
- action
- actions
- cell
- timing
- primary
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
Definitions
- the preferred embodiments described herein are preferably implemented using software and/or hardware components.
- the preferred embodiments can be implemented with a software application (i.e., computer-readable program code) running on a processor of a general-purpose computer.
- some or all of the functionality of the application can be implemented with application-specific hardware components.
- the term “application” shall be used herein to refer generally to the entity (be it software and/or hardware) used to implement the preferred embodiments described below.
- the term “tool” shall be used interchangeably with the term “application.”
- FIG. 1 is an illustration of a display output of an application of a preferred embodiment.
- the application displays two display regions 100 , 200 .
- display region refers to an area of display on one or more display devices (e.g., computer monitors).
- Each display region 100 , 200 can be a separate window, or the display regions 100 , 200 can be different areas in a single window.
- the first and second display regions 100 , 200 can be fixed or movable and can be non-overlapping (as in FIG. 1 ) or can overlap each other. Additionally, the first and second display regions 100 , 200 can be alternately shown (one then the other) in response to a command from the application and/or in response to a command from the user.
- the first display region 100 will sometimes be referred to herein as the “Structure Window,” and the second display region 200 will sometimes be referred to herein as the “Editor Pane.”
- the application displays a flowchart 150 representing a sample interactive multimedia experience in the first display region 100 .
- the term “flowchart” refers to any graphical representation of a branching structure.
- the flowchart 150 consists of a sequence of linked cells 1 - 8 .
- the cells represent media content and user interactivity and the links connecting the cells represent the flow of the multimedia experience through time.
- the first display region 100 is referred to as the “Structure Window” because it displays the graphical representation (i.e., the cells and the links connecting the cells) for the flowchart 150 .
- the flowchart 150 is displayed in its entirety in the first display region 100 .
- a flowchart is displayed in the first display region 100 even if only part of the flowchart is visible at one time in the first display region 100 .
- if the flowchart were larger than the first display region 100, only a portion of the flowchart would be visible in the first display region 100.
- a scroll bar and/or other navigation devices can be provided in the first display region 100 to allow a user to select which part of the flowchart is visible.
- the flowchart 150 in FIG. 1 comprises a plurality of cells 1 - 8 .
- the cells 1 - 8 are illustrated as graphical symbols, some of which are rectangular-shaped and others of which are diamond-shaped. It will be understood that the graphical symbols described herein are intended as illustrative rather than limiting and that, according to the present invention, other types of displays can be used to illustrate the cells in a flowchart.
- the rectangular-shaped cells are referred to herein as “simple cells,” and the diamond-shaped cells are referred to herein as “conditional cells.”
- a simple cell is a cell that contains a single branching link to a single cell. For example, in the flowchart 150 shown in FIG. 1, simple cell 4 branches to simple cell 5 and simple cell 5 branches to conditional cell 6.
- a conditional cell contains multiple branching links to multiple cells.
- the branching links are associated with conditions necessary for that branching link to be followed. For example, in the flowchart 150 shown in FIG. 1 , conditional cell 6 branches to simple cells 7 and 8 .
- the branch to simple cell 7 is followed if the condition “ready to order” is true, while the branch to simple cell 8 is followed if the condition is false.
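As a concrete sketch of this branching model, the following Python fragment distinguishes simple cells, which carry a single link, from conditional cells, which choose a link based on a condition. The class and field names are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SimpleCell:
    name: str
    next: Optional[str] = None  # single branching link to a single cell

@dataclass
class ConditionalCell:
    name: str
    condition: str  # condition necessary for the "true" branch to be followed
    if_true: str
    if_false: str

def next_cell(cell, answers):
    """Follow the cell's branching link; conditional cells consult user input."""
    if isinstance(cell, ConditionalCell):
        return cell.if_true if answers.get(cell.condition) else cell.if_false
    return cell.next

# Fragment of the flowchart 150 of FIG. 1
cells = {
    "5": SimpleCell("5", next="6"),
    "6": ConditionalCell("6", condition="ready to order", if_true="7", if_false="8"),
}
assert next_cell(cells["5"], {}) == "6"
assert next_cell(cells["6"], {"ready to order": True}) == "7"
assert next_cell(cells["6"], {"ready to order": False}) == "8"
```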
- a cell can contain two or more individual cells (simple cells or conditional cells). Such a cell is referred to herein as a “group cell” and can be used to simplify the display of a flowchart by grouping multiple cells together in a single cell.
- a group cell can contain other group cells (i.e., group cells can be nested) and other types of cells, such as “go to” cells, “alias” cells, and other types of cells described below. “Go to” cells can be used instead of a line to show the flow from that cell to another cell. In this way, “go to” cells can be used to keep a flowchart clean and readable. “Alias” cells can be used to mimic the function/operation of another cell.
- Different cells can have different timing. For example, some cells can be played after a previous cell initiates its function, while other cells can be played after a previous cell finishes its function. Other timing mechanisms for playing cells can be used, for example, a timing mechanism based on a common clock.
- the cells in a flowchart can contain any suitable content.
- a cell can contain text.
- a cell can also provide selection choices to a user and evaluate which choice was selected. For example, when conditional cell 6 in FIG. 1 is played, the user is prompted to input “yes” or “no” using a user interface device, such as a keyboard, mouse, microphone, remote control, or any other type of device.
- Conditional cell 6 also determines whether the user input is “yes” or “no.” If the user input is “yes,” the branch leading to cell 7 is followed, and the application runs the script “Great. Let's proceed.”
- a cell can also contain actions such as instructions or references to trigger playback of media, gather user input, generate visual or audio information, send/receive information or media to a database, process data, perform a calculation, or perform other functions, such as describing how and when media should be played.
- media assets include, but are not limited to, digital audio/video objects representing still art, animation, graphics, on-screen text, music, sound effects, or voiceover dialogue. It will be understood that any type of media playback from within a cell includes an action for executing the media playback associated with that cell.
- the term “media cell” shall be used herein to refer generally to a cell that can reference at least one media asset or represent a reference to the at least one media asset to be played by the application.
- a media cell acts as a container for actions, or references to actions, that trigger media playback.
- a media cell can contain multiple actions for executing media playback that are implemented by the application (or by another application) in a coordinated manner when the cell is played.
- each media cell includes at least one primary action and possibly one or more secondary actions whose timing is coordinated with the primary action.
- the primary action preferably triggers playback of media that conveys a comprehensive message at a particular moment of the experience, such as audio or video of an actor speaking.
- a secondary action preferably triggers playback of media that illustrates, reinforces and/or supports the comprehensive message being delivered through the primary action, such as text displaying all or a subset of the script performed by the actor and/or graphics that help illustrate the message of the actor's script.
- the timing of a secondary action of a media cell is based on a time X before or after the beginning or end of the primary action of that media cell. Different actions in a cell can trigger playback of media that execute simultaneously with each other. It will be understood that the timing of a secondary action can be coordinated with the timing of a primary action using other timing mechanisms. For example, the primary and secondary actions could be timed off of a common clock, the secondary action could be timed off of another secondary action that is timed off of the primary action, etc. Thus, each media cell contains one or more actions whose timing is coordinated.
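One way to model such a media cell in code is sketched below in Python. The class and field names are illustrative assumptions, not the patent's implementation; each secondary action stores an anchor ("S" for the primary action's start, "E" for its end) and a signed offset in seconds:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    description: str     # e.g. the script line or sound-effect description
    anchor: str = "S"    # timed off the primary action's start ("S") or end ("E")
    offset: float = 0.0  # signed offset in seconds from that anchor

@dataclass
class MediaCell:
    primary: Action
    secondary: List[Action] = field(default_factory=list)

# Media cell 2 of FIG. 1: a greeting plus two coordinated sound effects
cell2 = MediaCell(
    primary=Action("top of the morning"),
    secondary=[
        Action("leaves falling", anchor="E", offset=-1.50),
        Action("birds chirping", anchor="E", offset=-0.50),
    ],
)
```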
- the application distinguishes between media cells generally representing application flow and the coordinated actions performed by a cell when it is played.
- the media cell is a compound cell where several discrete and timed events can occur when flow reaches the cell during playback.
- media cells may function as placeholders within the overall progression of the application at runtime.
- Coordinated actions, such as those corresponding to the presentation or modification of media content during runtime flow, can be modeled internally in a media cell as separate objects themselves.
- the cells 1 - 8 in the flowchart 150 of FIG. 1 provide an interactive multimedia experience for a user.
- playing the flowchart means either that the application itself is executing the flowchart or that another application is executing the flowchart.
- the flowchart application could export its logic and data, including its media assets, to a runtime engine that plays back the multimedia experience using the exported logic and data.
- the interactive multimedia experience consists of spoken or verbal scripts accompanied by on-screen text and graphics, and proceeds according to the logical sequence defined by flowchart 150 that can be affected by the end-user's actions.
- the cells 1 - 8 represent distinct moments within the overall progression of the multimedia experience and are arranged in the flowchart 150 in the order in which they are traversed in operation.
- the application and/or runtime engine will generate a signal to invoke the cell's primary action, for example, for executing playback of a media asset referenced by the cell.
- the application and/or runtime engine will also generate signals to invoke any of the cell's secondary actions based on timing coordinated with the timing of the primary action.
- when the flowchart is played from its beginning, the user first hears an audio file voicing the greeting “top of the morning” triggered by the primary action 20 in the media cell 2 . The user then hears sound effects triggered by secondary actions 22 and 24 in the media cell 2 timed off of the primary action.
- a “leaves falling” sound effect corresponding to the secondary action 22 begins to play 1.50 seconds before the end of playback of the phrase “top of the morning.”
- a “birds chirping” sound effect the secondary action 24 begins to play 0.50 seconds before the end of playback of the phrase “top of the morning.”
- the “birds chirping” sound effect would be presented during the experience 1.00 second after the “leaves falling” sound effect has started.
- an audio file voicing the phrase “walking down the main path” is played at cell 3 .
- the user then hears the greeting “Welcome” at cell 4 , followed by the question “Are you ready to order?” at cell 5 .
- an audio file voicing the phrase “Great. Let's proceed.” is played at cell 7 or an audio file voicing the phrase “No problem. Take your time.” is played at cell 8 .
- the interactive conversation interface can be used to communicate ideas and information in an audio and/or visual environment, such as interactive computer games, commercials, guided tours, auctions, stories, and news, health, or financial services.
- the interface can be implemented with wired or wireless equipment that includes both audio and video output devices (such as a home computer or television) as well as with equipment that includes only an audio output device (such as a conventional telephone) or only a video output device. It is important to note that these preferred embodiments can be used in flowchart applications that are unrelated to an interactive multimedia experience. Accordingly, a specific type of cell should not be read into the claims unless explicitly recited therein.
- the secondary actions 22 and 24 for the media cell 2 are identified by being displayed within the media cell 2 in a separate area, for example a column, next to the primary action 20 , as shown in FIG. 1 .
- the secondary actions 22 and 24 are organized or stacked within the media cell 2 according to their chronological order of execution. Therefore, the media for the secondary actions 22 and 24 within the media cell 2 would begin execution in the same order in which the secondary actions 22 and 24 are displayed.
- the media cell also includes a minimize and restore button 26 that is clickable to selectively hide the display of the secondary actions 22 and 24 , for example, by alternately minimizing and restoring the display of the column having the secondary actions 22 and 24 therein.
- the timing values for the secondary actions 22 and 24 are stored in the secondary actions and are displayed directly beneath the descriptions of the secondary actions in the media cell 2 .
- the timing values for the secondary actions 22 and 24 of the media cell 2 are based on a time before the end of the primary action 20 .
- the timing value for the secondary action 22 is E−1.50 seconds, which indicates that the media associated with this secondary action (“leaves falling”) would begin playing 1.50 seconds before the end of playback of the primary action 20 (“top of the morning”).
- the timing value of E−0.50 seconds for the secondary action 24 means that the media associated with this secondary action (“birds chirping”) would begin to play 0.50 seconds before the end of playback of the primary action 20 (“top of the morning”), or 1.00 second after the secondary action 22 begins.
- if the timing of a secondary action is based on a time X after the end of the primary action, the timing value can be represented, for example, as E+X; if the timing of the secondary action is based on a time X before the start of the primary action, the timing value can be represented, for example, as S−X; and if the timing of a secondary action is based on a time X after the start of a primary action, the timing value can be represented, for example, as S+X.
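Under these conventions, a timing value resolves to an absolute start time once the primary action's start time and duration are known. A minimal Python sketch follows; the function name and the assumed 3.0-second duration of “top of the morning” are illustrative, not from the patent:

```python
def resolved_start(anchor, offset, primary_start, primary_duration):
    """Absolute start time of a secondary action timed as S±X or E±X."""
    base = primary_start if anchor == "S" else primary_start + primary_duration
    return base + offset

# FIG. 1 example, assuming the primary action starts at t=0 and lasts 3.0 s:
leaves = resolved_start("E", -1.50, 0.0, 3.0)  # starts 1.50 s before the end
birds = resolved_start("E", -0.50, 0.0, 3.0)   # starts 0.50 s before the end
assert birds - leaves == 1.00  # "birds chirping" starts 1.00 s after "leaves falling"
```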
- the display of related secondary actions 22 and 24 within the media cell 2 in this way correctly reflects their child-parent relationship to the primary action 20 and their membership in the same group, represented by the media cell 2 .
- the display of related secondary actions 22 and 24 within the media cell 2 is consistent with the flowchart metaphor used by the application. In particular, from a writer's standpoint, it may be desirable to be able to visualize the linear flow through the flowchart as if it were a normal sequential storyline in the experience.
- a flowchart may contain many different branching structures. A single path represents application flow along one of many possible paths in the flowchart during an actual runtime session.
- when a secondary action in a media cell is selected, for example secondary action 22 in the media cell 2 of FIG. 1 , the secondary action is highlighted within the media cell, while the single path continues beyond the media cell along the path that contains it. This is consistent with the notion that playback in the flowchart should proceed sequentially from one cell to the next.
- the application may implement different methods to determine a single path.
- the selection of a single cell can be used to isolate the cells above and below the selected cell to form a single path based on the history of the cell and the history of the cells above and below it in succession.
- a path can be determined based on the most-frequently selected path containing the selected cell or the application can choose the path that most recently contained the selected cell.
- the application can randomly determine a path comprising the selected cell or can semi-randomly determine a path comprising the selected cell, such as when part of the path is selected based upon some form of logic (e.g., most frequently selected links three cells above and below the selected cell), but the rest of the path is selected at random (e.g., all other links are selected randomly).
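The path-determination strategies above can be sketched as interchangeable link-selection policies. This Python fragment is an illustrative assumption about how such policies might be organized, not the patent's code; `history` stands for links followed in past playback sessions:

```python
import random
from collections import Counter

def choose_link(links, history, mode="frequent"):
    """Pick one outgoing link to extend a single path through the flowchart.
    links: candidate links out of the current cell; history: links followed
    in past sessions, oldest first."""
    candidates = [l for l in history if l in links]
    if mode == "frequent" and candidates:
        return Counter(candidates).most_common(1)[0][0]  # most-frequently selected
    if mode == "recent" and candidates:
        return candidates[-1]                            # most recently followed
    return random.choice(links)                          # random / semi-random fallback

assert choose_link(["a", "b"], ["b", "a", "b"], mode="frequent") == "b"
assert choose_link(["a", "b"], ["a", "b"], mode="recent") == "b"
```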
- the application may also implement different techniques to identify the cells in a single path. For example, cells along a single path may be identified by displaying the lines linking the cells differently (e.g., in a different color, shading, or thickness) than the lines linking cells that are not along the path. Other techniques can be used to identify cells along a path, for example, the borders of the cells along a path and their branches may be displayed with thicker lines in the first display region 100 .
- the second display region 200 contains a textual view 250 of the cells 1 - 7 in the order in which the cells appear in the flowchart 150 .
- the textual view generally shows a “description” of the primary action of a cell (e.g., the text that is contained in the primary action, the line of dialogue that will be voiced by the primary action, the description of the animation or of the SFX that will be played by the primary action, etc.).
- the textual view also may include “descriptions” of the secondary actions of the cell.
- the content displayed in the textual view 250 can also contain other cell “properties,” such as the font of displayed text, the name of the actor who will provide the media content, the cell's timing relative to other cells, the volume at which audio content of the cell should be played, the frames per second of animation, and other information.
- the “textual” view can also contain graphics.
- the text of the cell can also be supported with a graphics button or thumbnail image next to the text that indicates the type of media in the cell (e.g., a button with a little horn indicating a SFX) and is clickable to open a properties window of the cell, to playback the media, or to perform other functions.
- editing of the secondary actions 22 and 24 can be performed in-place within each media cell.
- a user individually selects a cell in the flowchart.
- a user uses a pointing device (such as a mouse or trackball) to move a pointer 30 over the media cell 2 , and then selects that cell by pressing a selector switch (such as the mouse button).
- the user can select the media cell 2 using any other type of user interface device. For example, if the cells are numbered, the user can select the media cell 2 by typing in the cell number using a keyboard or by speaking the number of the cell into a microphone.
- the application can automatically select the media cell 2 (e.g., based on the output of some operation being run by the application or based on the output sent to the application by another application).
- a user can then select one of the secondary actions 22 and 24 shown in the media cell 2 using the pointing device or another user interface device as described above in order to edit their content and/or properties, including script description, timing properties, and visual display properties.
- a user selects the secondary action 22 and changes its content from “leaves falling” to “rain drops keep falling” in-place within the media cell 2 .
- the user changes the timing value for this secondary action from E−1.50 seconds to E−0.20 seconds.
- FIG. 5 illustrates a media cell 9 , including a primary action 90 and secondary actions 92 to 96 , and a display region 300 or “properties window” that displays and allows a user to edit the content and/or properties of the actions 90 to 96 .
- the display region 300 displays different properties 94 a and 94 b of the secondary action 94 selected by the user.
- the display region 300 receives user input, and the application can apply the input received to the secondary action 94 in the media cell 9 .
- the application automatically rearranges secondary actions within a media cell to reflect their chronological order of execution.
- when the timing of the secondary action 22 (“leaves falling”) is changed from E−1.50 seconds to E−0.20 seconds, the order of execution of the secondary actions 22 and 24 changes because the secondary action 22 would now be executed 0.30 seconds after the secondary action 24 (“birds chirping”) begins.
- the application automatically shifts the secondary action 22 to fall below the secondary action 24 (“birds chirping”). Therefore, if the timing of a secondary action is updated, so is its order in the list of secondary actions. This is consistent with the notion that playback in the flowchart should proceed sequentially from one action to the next.
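This re-stacking behavior amounts to sorting the secondary actions by their resolved start times. A Python sketch under the same illustrative timing conventions (the 3.0-second primary duration is an assumption):

```python
def sort_secondary(actions, primary_duration):
    """Order secondary actions chronologically. Each action is a
    (description, anchor, offset) tuple; anchor "S" = primary start,
    "E" = primary end, offset in seconds."""
    def start(action):
        _, anchor, offset = action
        return (0.0 if anchor == "S" else primary_duration) + offset
    return sorted(actions, key=start)

# After editing "leaves falling" from E-1.50 to E-0.20, it now starts
# 0.30 s after "birds chirping", so it is shifted below it:
edited = [("leaves falling", "E", -0.20), ("birds chirping", "E", -0.50)]
assert [a[0] for a in sort_secondary(edited, 3.0)] == ["birds chirping", "leaves falling"]
```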
- the primary action 20 and the secondary actions 22 and 24 are displayed in separate areas within the media cell 2
- the application may implement other methods to identify and/or to differentiate the secondary actions of a media cell from the primary action of the cell.
- the secondary actions 22 and 24 can be displayed in the first display region 100 adjacent or proximate the media cell 2 according to their chronological order of execution. The positioning of the secondary actions 22 and 24 relative to the media cell 2 would identify their child-parent relationship with the primary action 20 of the media cell 2 .
- Visual cues can also be provided, for example, links connecting each of the secondary actions 22 and 24 to the primary action 20 , so as to visually reinforce the child-parent relationship between the secondary actions 22 and 24 and the primary action 20 and their membership in the same group.
- the textual view 250 in the second display region 200 can show both a description of the primary action 20 and the descriptions of the secondary actions 22 and 24 in the order of their execution.
- the textual view 250 can display the primary action 20 in a first column and the secondary actions 22 and 24 next to the primary action in a second column.
- the secondary actions 22 and 24 of the media cell 2 can be displayed in a fourth display region separate from the first and second display regions 100 and 200 , the fourth display region being associated and/or visually linked with the media cell 2 to reinforce the child-parent relationship between the secondary actions 22 and 24 and the primary action 20 and their membership in the same group.
- the application can be equipped with various functionality to facilitate the construction of the media assets scripted by the writer and to provide the programming necessary to fully render the interactive multimedia experience on a given platform.
- cells can contain actions to be implemented, including instructions to play a media asset such as an audio file or a video file.
- the application can sort the various pieces of desired media based on the talent that is necessary to create the media or on other criteria for sorting. For example, the actions in a media cell can be divided into music tasks, animation tasks, art tasks, programming tasks, SFX tasks, writing tasks, video tasks, and performance tasks.
- artists used to create the media can be assigned a list of tasks that need to be performed.
- the application can assemble the completed media for playback.
- Because an interactive multimedia experience can have thousands of assets and a non-linear structure, it is preferred that the application maintain a database to track the media assets. The slots in the database can be created before or after the media assets are created.
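The sorting of actions into task lists described above can be sketched as simple grouping by talent type. The categories and function name below are illustrative assumptions, not the patent's implementation:

```python
from collections import defaultdict

def sort_into_task_lists(actions):
    """Group actions by the talent needed to create their media assets,
    so each artist can be assigned a list of tasks."""
    tasks = defaultdict(list)
    for description, task_type in actions:
        tasks[task_type].append(description)
    return dict(tasks)

actions = [
    ("top of the morning", "performance"),
    ("leaves falling", "SFX"),
    ("birds chirping", "SFX"),
    ("main path background art", "art"),
]
assert sort_into_task_lists(actions)["SFX"] == ["leaves falling", "birds chirping"]
```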
Abstract
Tools, methods and systems for identifying coordinated actions in a flowchart representing a multimedia experience. These include displaying a graphical view of a flowchart representing a multimedia experience and defining, within the flowchart, a cell including a primary action configured to trigger playback of a primary media asset and secondary actions configured to trigger playback of secondary media assets, wherein each of the secondary actions has a timing coordinated with the primary action. The tools, methods and systems further include displaying a representation of the primary action, displaying representations of the secondary actions in chronological order according to their timing, and visually associating each of the representations of the secondary actions with the representation of the primary action.
Description
- The present patent document claims the benefit of the filing date under 35 U.S.C. § 119(e) of Provisional U.S. Patent Application Ser. No. 60/875,179, filed Dec. 14, 2006, which is hereby incorporated by reference.
- This patent is related to co-pending, Provisional U.S. Patent Application Ser. No. 60/875,071, titled, “System and Method for Controlling Actions Within a Programming Environment,” filed Dec. 14, 2006; and U.S. patent application Ser. No. 10/038,527, titled, “Method for Identifying Cells in a Path in a Flowchart and for Synchronizing Graphical and Textual Views of a Flowchart,” filed on Jan. 2, 2002. The entire contents of these related patent applications are incorporated herein by reference for all purposes.
- A multimedia experience refers to the use of media in an interactive environment. The media generally include one or more types of information content, including for example, text, audio, graphics, animation, and video. During a multimedia experience, the media are presented to an end-user according to a logical sequence that can be affected by the end-user's actions.
- A multimedia experience can be modeled as a flowchart that defines the logical sequence for playback of the multimedia experience. Such a flowchart generally consists of a sequence of linked cells that directly or indirectly reference media assets to be played by an application in a predefined order. The selection of which media referenced in the cells will be played can depend, in part, upon the choices made by end-users during each playback session of the multimedia experience. Flowcharts can be prepared by hand using pencil and paper or can be prepared electronically using a computer. Some software applications require a user to build a flowchart by drawing graphical shapes and then typing text into each graphical shape.
- Often, during a playback session, the media referenced by several cells are presented to the end user in a coordinated manner, either simultaneously or in short succession of one another. In these situations, one cell is considered to represent playback of primary media off of which related cells representing secondary media are timed. In order to illustrate the dependencies between coordinated media, it is known to display the secondary cells on a side branch connecting these cells to the primary cell. The side branch may be visually distinct from other branches in the flowchart. This approach is presented in application Ser. No. 10/038,527 filed Jan. 2, 2002.
- It is often desirable to determine the linear flow through a single path in the flowchart. However, if there are many branching structures and secondary cells in the flowchart, it can be difficult to identify the primary cells forming a single path through the flowchart. This can happen, for example, if a writer is creating a flowchart to structurally represent a multimedia experience since a reasonably sophisticated experience can generate a flowchart that is quite large and unwieldy, with hundreds or thousands of cells and complex branching between the cells. From the writer's standpoint, it is desirable to be able to visualize the linear flow through the flowchart as if it were a normal sequential storyline in the experience by viewing the graphical view of the flowchart.
- There is a need, therefore, for a method that can be used to overcome the disadvantages discussed above.
- The present invention provides tools, methods and systems for identifying coordinated actions in a flowchart representing a multimedia experience. The tools, methods and systems include displaying a graphical view of a flowchart representing a multimedia experience and defining, within the flowchart, a cell including a primary action configured to trigger playback of a primary media asset and secondary actions configured to trigger playback of secondary media assets, wherein each of the secondary actions has a timing coordinated with the primary action. The tools, methods and systems further include displaying a representation of the primary action, displaying representations of the secondary actions in chronological order according to their timing; and visually associating each of the representations of the secondary actions with the representation of the primary action.
- Additional features and advantages of the present invention are described in, and will be apparent from, the following Detailed Description and the figures.
-
FIG. 1 is an illustration of an embodiment in which actions of a media cell are identified in the cell; -
FIG. 2 is an illustration of an embodiment in which a media cell is selected to modify its actions; -
FIG. 3 is an illustration of an embodiment in which an action of the media cell of FIG. 2 is edited in the cell; -
FIG. 4 is an illustration of an embodiment in which the actions of the media cell of FIG. 2 are sorted in chronological order of their timing; and -
FIG. 5 is an illustration of an embodiment in which actions of a media cell are displayed and edited in a separate display region. - The preferred embodiments described herein are preferably implemented using software and/or hardware components. For example, the preferred embodiments can be implemented with a software application (i.e., computer-readable program code) running on a processor of a general-purpose computer. Alternatively, some or all of the functionality of the application can be implemented with application-specific hardware components. For simplicity, the term “application” shall be used herein to refer generally to the entity (be it software and/or hardware) used to implement the preferred embodiments described below. The term “tool” shall be used interchangeably with the term “application.”
- Turning now to the drawings,
FIG. 1 is an illustration of a display output of an application of a preferred embodiment. Here, the application displays two display regions. The first and second display regions can be displayed side-by-side (as shown in FIG. 1 ) or can overlap each other. The first display region 100 will sometimes be referred to herein as the “Structure Window,” and the second display region 200 will sometimes be referred to herein as the “Editor Pane.” - As shown in
FIG. 1 , the application displays a flowchart 150 representing a sample interactive multimedia experience in the first display region 100. As used herein, the term “flowchart” refers to any graphical representation of a branching structure. In this embodiment, the flowchart 150 consists of a sequence of linked cells 1-8. As described in greater detail further below, the cells represent media content and user interactivity, and the links connecting the cells represent the flow of the multimedia experience through time. The first display region 100 is referred to as the “Structure Window” because it displays the graphical representation (i.e., the cells and the links connecting the cells) for the flowchart 150. In FIG. 1 , the flowchart 150 is displayed in its entirety in the first display region 100. It should be noted that a flowchart is displayed in the first display region 100 even if only part of the flowchart is visible at one time in the first display region 100. For example, if the flowchart were larger than the first display region 100, only a portion of the flowchart would be visible in the first display region 100. In this situation, a scroll bar and/or other navigation devices can be provided in the first display region 100 to allow a user to select which part of the flowchart is visible. - The
flowchart 150 in FIG. 1 comprises a plurality of cells 1-8. In this embodiment, the cells 1-8 are illustrated as graphical symbols, some of which are rectangular-shaped and others of which are diamond-shaped. It will be understood that the graphical symbols described herein are intended as illustrative rather than limiting and that, according to the present invention, other types of displays can be used to illustrate the cells in a flowchart. The rectangular-shaped cells are referred to herein as “simple cells,” and the diamond-shaped cells are referred to herein as “conditional cells.” A simple cell is a cell that contains a single branching link to a single cell. For example, in the flowchart 150 shown in FIG. 1 , simple cell 4 branches to simple cell 5, and simple cell 5 branches to conditional cell 6. Unlike a simple cell, a conditional cell contains multiple branching links to multiple cells. The branching links are associated with conditions necessary for that branching link to be followed. For example, in the flowchart 150 shown in FIG. 1 , conditional cell 6 branches to simple cells 7 and 8. The branch to simple cell 7 is followed if the condition “ready to order” is true, while the branch to simple cell 8 is followed if the condition is false. - It should be noted that a cell can contain two or more individual cells (simple cells or conditional cells). Such a cell is referred to herein as a “group cell” and can be used to simplify the display of a flowchart by grouping multiple cells together in a single cell. In addition to simple and conditional cells, a group cell can contain other group cells (i.e., group cells can be nested) and other types of cells, such as “go to” cells, “alias” cells, and other types of cells described in the next paragraph. “Go to” cells can be used instead of a line to show the flow from that cell to another cell. In this way, “go to” cells can be used to keep a flowchart clean and readable. “Alias” cells can be used to mimic the function/operation of another cell.
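The branching behavior of simple and conditional cells described above can be illustrated with a short sketch. The following Python code is illustrative only: the class names, the `choices` mapping, and the `traverse` helper are assumptions, not part of the disclosure; the cell numbers and the "ready to order" condition mirror cells 4-8 of FIG. 1.

```python
# Illustrative sketch of simple cells (one outgoing link) and
# conditional cells (multiple outgoing links gated by a condition).

class SimpleCell:
    def __init__(self, cell_id, next_cell=None):
        self.cell_id = cell_id
        self.next_cell = next_cell          # single branching link

    def follow(self, _choices):
        return self.next_cell

class ConditionalCell:
    def __init__(self, cell_id, condition, if_true, if_false):
        self.cell_id = cell_id
        self.condition = condition          # e.g. "ready to order"
        self.if_true = if_true
        self.if_false = if_false

    def follow(self, choices):
        # Multiple branching links; the condition selects one.
        return self.if_true if choices.get(self.condition) else self.if_false

# Cells 4-8 from FIG. 1, wired back-to-front.
cell7 = SimpleCell(7)
cell8 = SimpleCell(8)
cell6 = ConditionalCell(6, "ready to order", cell7, cell8)
cell5 = SimpleCell(5, cell6)
cell4 = SimpleCell(4, cell5)

def traverse(start, choices):
    """Return the cell ids visited from `start` given the user's choices."""
    path, cell = [], start
    while cell is not None:
        path.append(cell.cell_id)
        cell = cell.follow(choices)
    return path

print(traverse(cell4, {"ready to order": True}))   # [4, 5, 6, 7]
print(traverse(cell4, {"ready to order": False}))  # [4, 5, 6, 8]
```

The two traversals correspond to the two branches out of conditional cell 6: the true branch reaches cell 7 and the false branch reaches cell 8.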
- Different cells can have different timing. For example, some cells can be played after a previous cell initiates its function, while other cells can be played after a previous cell finishes its function. Other timing mechanisms for playing cells can be used, for example, a timing mechanism based on a common clock.
- The cells in a flowchart can contain any suitable content. For example, as in a traditional flowchart drawn on paper, a cell can contain text. A cell can also provide selection choices to a user and evaluate which choice was selected. For example, when
conditional cell 6 in FIG. 1 is played, the user is prompted to input “yes” or “no” using a user interface device, such as a keyboard, mouse, microphone, remote control, or any other type of device. Conditional cell 6 also determines whether the user input is “yes” or “no.” If the user input is “yes,” the branch leading to cell 7 is followed, and the application runs the script “Great. Let's proceed.”
- According to the present embodiment, a cell can also contain actions such as instructions or references to trigger playback of media, gather user input, generate visual or audio information, send/receive information or media to a database, process data, perform a calculation, or perform other functions, such as describing how and when media should be played. Examples of media assets include, but are not limited to, digital audio/video objects representing still art, animation, graphics, on-screen text, music, sound effects, or voiceover dialogue. It will be understood that any type of media playback from within a cell includes an action for executing the media playback associated with that cell.
- The term “media cell” shall be used herein to refer generally to a cell that can reference at least one media asset or represent a reference to the at least one media asset to be played by the application. Thus, conceptually, a media cell acts as a container for actions, or references to actions, that trigger media playback. A media cell can contain multiple actions for executing media playback that are implemented by the application (or by another application) in a coordinated manner when the cell is played. In particular, each media cell includes at least one primary action and possibly one or more secondary actions whose timing is coordinated with the primary action. The primary action preferably triggers playback of media that conveys a comprehensive message at a particular moment of the experience, such as audio or video of an actor speaking. A secondary action preferably triggers playback of media that illustrates, reinforces, and/or supports the comprehensive message being delivered through the primary action, such as text displaying all or a subset of the script performed by the actor and/or graphics that help illustrate the message of the actor's script. Preferably, the timing of a secondary action of a media cell is based on a time X before or after the beginning or end of the primary action of that media cell. Different actions in a cell can trigger playback of media that execute simultaneously with each other. It will be understood that the timing of a secondary action can be coordinated with the timing of a primary action using other timing mechanisms. For example, the primary and secondary actions could be timed off of a common clock, a secondary action could be timed off of another secondary action that is itself timed off of the primary action, etc. Thus, each media cell contains one or more actions whose timing is coordinated.
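The media-cell container just described can be sketched as a small data model. This is a minimal illustration under stated assumptions: the class names, field names, and the timing-string encoding (e.g. "E-1.50" for 1.50 seconds before the end of the primary action) are inventions for the sketch, not part of the disclosure.

```python
# Illustrative data model: a media cell holds one primary action and a
# list of secondary actions whose timing is coordinated with it.

from dataclasses import dataclass, field

@dataclass
class Action:
    description: str        # e.g. a voiced line or a sound effect
    asset: str              # reference to the media asset to play
    timing: str = "S+0.00"  # offset relative to the primary action (assumed encoding)

@dataclass
class MediaCell:
    cell_id: int
    primary: Action
    secondaries: list = field(default_factory=list)

# Media cell 2 from the example experience: a voiced greeting as the
# primary action, two sound effects as coordinated secondary actions.
cell2 = MediaCell(
    cell_id=2,
    primary=Action("top of the morning", "greeting.wav"),
    secondaries=[
        Action("leaves falling", "leaves.wav", timing="E-1.50"),
        Action("birds chirping", "birds.wav", timing="E-0.50"),
    ],
)
print(len(cell2.secondaries))  # 2
```

In this layout the cell is simply a container: playback logic elsewhere would invoke the primary action when flow reaches the cell and schedule each secondary action from its timing string.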
- In this way, the application distinguishes between media cells generally representing application flow and the coordinated actions performed by a cell when it is played. In particular, the media cell is a compound cell where several discrete and timed events can occur when flow reaches the cell during playback. Thus, media cells may function as placeholders within the overall progression of the application at runtime. Coordinated actions, such as those corresponding to the presentation or modification of media content during runtime flow, can be modeled internally in a media cell as separate objects themselves.
- When the
flowchart 150 is played during a playback session, the cells 1-8 in the flowchart 150 of FIG. 1 provide an interactive multimedia experience for a user. It will be understood that playing the flowchart means either that the application itself is executing the flowchart or that another application is executing the flowchart. For example, the flowchart application could export its logic and data, including its media assets, to a runtime engine that plays back the multimedia experience using the exported logic and data. The interactive multimedia experience consists of spoken or verbal scripts accompanied by on-screen text and graphics, and proceeds according to the logical sequence defined by flowchart 150 that can be affected by the end-user's actions. The cells 1-8 represent distinct moments within the overall progression of the multimedia experience and are arranged in the flowchart 150 in the order in which they are traversed in operation. When playback reaches a particular cell, the application and/or runtime engine will generate a signal to invoke the cell's primary action, for example, for executing playback of a media asset referenced by the cell. The application and/or runtime engine will also generate signals to invoke any of the cell's secondary actions based on timing coordinated with the timing of the primary action.
- In particular, when the flowchart is played from its beginning, the user first hears an audio file voicing the greeting “top of the morning” triggered by the
primary action 20 in the media cell 2. The user then hears sound effects triggered by the secondary actions 22 and 24 in the media cell 2, timed off of the primary action. First, a “leaves falling” sound effect corresponding to the secondary action 22 begins to play 1.50 seconds before the end of playback of the phrase “top of the morning.” Then, a “birds chirping” sound effect corresponding to the secondary action 24 begins to play 0.50 seconds before the end of playback of the phrase “top of the morning.” Thus, the “birds chirping” sound effect would be presented during the experience 1.00 second after the “leaves falling” sound effect has started. Next, an audio file voicing the phrase “walking down the main path” is played at cell 3. The user then hears the greeting “Welcome” at cell 4, followed by the question “Are you ready to order?” at cell 5. Depending on the user's response, an audio file voicing the phrase “Great. Let's proceed.” is played at cell 7, or an audio file voicing the phrase “No problem. Take your time.” is played at cell 8.
- The interactive conversation interface can be used to communicate ideas and information in an audio and/or visual environment, such as interactive computer games, commercials, guided tours, auctions, stories, and news, health, or financial services. The interface can be implemented with wired or wireless equipment that includes both audio and video output devices (such as a home computer or television) as well as with equipment that includes only an audio output device (such as a conventional telephone) or only a video output device. It is important to note that these preferred embodiments can be used in flowchart applications that are unrelated to an interactive multimedia experience. Accordingly, a specific type of cell should not be read into the claims unless explicitly recited therein.
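The coordinated timings in the playback walkthrough above (the “leaves falling” effect starting 1.50 seconds before the end of the primary action, “birds chirping” 0.50 seconds before) can be resolved into a concrete schedule. The sketch below is illustrative: the string encoding of the timing values and the helper names are assumptions, and the 3.0-second primary duration is an invented figure for the example.

```python
# Resolve timing values of the form E-X / E+X / S-X / S+X (relative to
# the end or start of the primary action) into absolute start times,
# then order the secondary actions chronologically.

def resolve(timing, primary_start, primary_end):
    """Turn a timing string such as 'E-1.50' into an absolute time."""
    anchor, sign, offset = timing[0], timing[1], float(timing[2:])
    base = primary_start if anchor == "S" else primary_end
    return base + offset if sign == "+" else base - offset

# Primary action "top of the morning", assumed to span t = 0.0 .. 3.0 s.
secondaries = [
    ("birds chirping", "E-0.50"),
    ("leaves falling", "E-1.50"),
]
schedule = sorted((resolve(t, 0.0, 3.0), name) for name, t in secondaries)
print(schedule)  # [(1.5, 'leaves falling'), (2.5, 'birds chirping')]
```

Consistent with the walkthrough, “birds chirping” resolves to 1.00 second after “leaves falling”, regardless of the order in which the secondary actions were listed.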
- In accordance with one embodiment of the present invention, the
secondary actions 22 and 24 of the media cell 2 are identified by being displayed within the media cell 2 in a separate area, for example a column, next to the primary action 20, as shown in FIG. 1 . Preferably, the secondary actions 22 and 24 are sorted within the media cell 2 according to their chronological order of execution. Therefore, the media for the secondary actions 22 and 24 of the media cell 2 would begin execution in the same order in which the secondary actions 22 and 24 are listed. The media cell 2 can also include a button 26 that is clickable to selectively hide the display of the secondary actions 22 and 24.
- The timing values for the
secondary actions 22 and 24 are also displayed in the media cell 2. In this example, the timing values for the secondary actions 22 and 24 of the media cell 2 are based on a time before the end of the primary action 20. Thus, the timing value for the secondary action 22 is E−1.50 seconds, which indicates that the media associated with this secondary action (“leaves falling”) would begin playing 1.50 seconds before the end of playback of the primary action 20 (“top of the morning”). Likewise, the timing value of E−0.50 seconds for the secondary action 24 means that the media associated with this secondary action (“birds chirping”) would begin to play 0.50 seconds before the end of playback of the primary action 20 (“top of the morning”), or 1.00 second after the secondary action 22 begins. In addition, if the timing of a secondary action is based on a time X after the end of a primary action, the timing value can be represented, for example, as E+X; if the timing of the secondary action is based on a time X before the start of the primary action, the timing value can be represented, for example, as S−X; and if the timing of a secondary action is based on a time X after the start of a primary action, the timing value can be represented, for example, as S+X.
- The display of related
secondary actions 22 and 24 within the media cell 2 in this way correctly reflects their child-parent relationship to the primary action 20 and their membership in the same group, represented by the media cell 2. Also, the display of related secondary actions 22 and 24 within the media cell 2 is consistent with the flowchart metaphor used by the application. In particular, from a writer's standpoint, it may be desirable to be able to visualize the linear flow through the flowchart as if it were a normal sequential storyline in the experience. However, a flowchart may contain many different branching structures. A single path represents application flow along one of many possible paths in the flowchart during an actual runtime session. When a secondary action in a media cell is selected, for example secondary action 22 in the media cell 2 of FIG. 1 , the secondary action is highlighted within the media cell, while the single path continues beyond the media cell along the path that contains it. This is consistent with the notion that playback in the flowchart should proceed sequentially from one cell to the next.
- It should be understood that the application may implement different methods to determine a single path. For example, the selection of a single cell can be used to isolate the cells above and below the selected cell to form a single path based on the history of the cell and the history of the cells above and below it in succession. Alternatively, a path can be determined based on the most-frequently selected path containing the selected cell, or the application can choose the path that most recently contained the selected cell. 
Other methods can be used, for example, the application can randomly determine a path comprising the selected cell or can semi-randomly determine a path comprising the selected cell, such as when part of the path is selected based upon some form of logic (e.g., most frequently selected links three cells above and below the selected cell), but the rest of the path is selected at random (e.g., all other links are selected randomly). The application may also implement different techniques to identify the cells in a single path. For example, cells along a single path may be identified by displaying the lines linking the cells differently (e.g., in a different color, shading, or thickness) than the lines linking cells that are not along the path. Other techniques can be used to identify cells along a path, for example, the borders of the cells along a path and their branches may be displayed with thicker lines in the
first display region 100. - Additionally, the content of the cells along a single path is displayed in the second display region 200 (the Editor Pane) to allow a user to read through the content of those cells in isolation from the cells in the other paths. As shown in
FIG. 1 , the second display region 200 contains a textual view 250 of the cells 1-7 in the order in which the cells appear in the flowchart 150. The textual view generally shows a “description” of the primary action of a cell (e.g., the text that is contained in the primary action, the line of dialogue that will be voiced by the primary action, the description of the animation or of the SFX that will be played by the primary action, etc.). The textual view also may include “descriptions” of the secondary actions of the cell. The content displayed in the textual view 250 can also contain other cell “properties,” such as the font of displayed text, the name of the actor who will provide the media content, the cell's timing relative to other cells, the volume at which audio content of the cell should be played, the frames per second of animation, and other information. It should be noted that the “textual” view can also contain graphics. For example, the text of the cell can also be supported with a graphics button or thumbnail image next to the text that indicates the type of media in the cell (e.g., a button with a little horn indicating a SFX) and is clickable to open a properties window of the cell, to play back the media, or to perform other functions. - In the present embodiment, editing of the
secondary actions 22 and 24 can be performed directly in the flowchart 150 displayed in FIG. 2 . A user uses a pointing device (such as a mouse or trackball) to move a pointer 30 over the media cell 2, and then selects that cell by pressing a selector switch (such as the mouse button). Alternatively, the user can select the media cell 2 using any other type of user interface device. For example, if the cells are numbered, the user can select the media cell 2 by typing in the cell number using a keyboard or by speaking the number of the cell into a microphone. Additionally, instead of the user selecting the media cell 2, the application can automatically select the media cell 2 (e.g., based on the output of some operation being run by the application or based on the output sent to the application by another application). A user can then select one of the secondary actions 22 and 24 of the media cell 2 using the pointing device or another user interface device as described above in order to edit its content and/or properties, including script description, timing properties, and visual display properties. For example, as shown in FIG. 3 , a user selects the secondary action 22 and changes its content from “leaves falling” to “rain drops keep falling” in-place within the media cell 2. Then, as shown in FIG. 4 , the user changes the timing value for this secondary action from E−1.50 seconds to E−0.20 seconds.
- Alternatively, when the user selects a secondary action, a third display region can be provided for editing its content and/or properties. For example,
FIG. 5 illustrates a media cell 9, including a primary action 90 and secondary actions 92 to 96, and a display region 300 or “properties window” that displays and allows a user to edit the content and/or properties of the actions 90 to 96. In the example, the display region 300 displays different properties of the secondary action 94 selected by the user. The display region 300 receives user input, and the application can apply the input received to the secondary action 94 in the media cell 9. - Referring again to
FIGS. 3 and 4 , the application automatically rearranges secondary actions within a media cell to reflect their chronological order of execution. As shown in FIG. 4 , when the timing of the secondary action 22 (“leaves falling”) is changed from E−1.50 seconds to E−0.20 seconds, the order of execution of the secondary actions 22 and 24 is reversed. The secondary action 22 would now be executed 0.30 seconds after the secondary action 24 (“birds chirping”) begins. As a result, the application automatically shifts the secondary action 22 to fall below the secondary action 24 (“birds chirping”). Therefore, if the timing of a secondary action is updated, so is its order in the list of secondary actions. This is consistent with the notion that playback in the flowchart should proceed sequentially from one action to the next.
- While according to the present embodiment the
primary action 20 and the secondary actions 22 and 24 are displayed within the media cell 2, it should be understood that the application may implement other methods to identify and/or to differentiate the secondary actions of a media cell from the primary action of the cell. For example, the secondary actions 22 and 24 can be displayed in the first display region 100 adjacent or proximate the media cell 2 according to their chronological order of execution. The positioning of the secondary actions 22 and 24 relative to the media cell 2 would identify their child-parent relationship with the primary action 20 of the media cell 2. Visual cues can also be provided, for example, links connecting each of the secondary actions 22 and 24 to the primary action 20, so as to visually reinforce the child-parent relationship between the secondary actions 22 and 24 and the primary action 20 and their membership in the same group. Alternatively, the textual view 250 in the second display region 200 can show both a description of the primary action 20 and descriptions of the secondary actions 22 and 24, for example, with the primary action 20 in a first column and the secondary actions 22 and 24 in a second column. Alternatively, the secondary actions 22 and 24 of the media cell 2 can be displayed in a fourth display region separate from the first and second display regions 100 and 200, with visual cues associating them with the media cell 2 to reinforce the child-parent relationship between the secondary actions 22 and 24 and the primary action 20 and their membership in the same group.
- It will be understood that the application can be equipped with various functionality to allow it to facilitate the construction of the media assets scripted by the writer and for providing the programming necessary to fully render the interactive multimedia experience on a given platform. As noted above, cells can contain actions to be implemented, including instructions to play a media asset such as an audio file or a video file. When a writer is scripting the content of the interactive multimedia experience, those media assets may not yet exist. The application can sort the various pieces of desired media based on the talent that is necessary to create the media or on other criteria for sorting. 
For example, the actions in a media cell can be divided into music tasks, animation tasks, art tasks, programming tasks, SFX tasks, writing tasks, video tasks, and performance tasks. In this way, artists used to create the media can be assigned a list of tasks that need to be performed. When each of the media assets is created by the artists and inserted into designated “slots” in a database, the application can assemble the completed media for playback. The slots in the database can be created before or after the media assets are created. Because an interactive multimedia experience can have thousands of assets and a non-linear structure, it is preferred that the application maintain a database to track the media assets.
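The sorting of desired media into per-talent task lists described above can be sketched as a simple grouping step. This is an illustration only: the action list, the category labels attached to each action, and the data layout are assumptions made for the example (the category names come from the text).

```python
# Group pending media actions into task lists by the talent needed to
# create each asset, so each artist can be handed a list of tasks.

from collections import defaultdict

# (description, task category) pairs; categories follow the text.
actions = [
    ("record greeting voiceover", "performance"),
    ("leaves falling sound effect", "SFX"),
    ("birds chirping sound effect", "SFX"),
    ("main path background art", "art"),
]

tasks = defaultdict(list)
for description, category in actions:
    tasks[category].append(description)

for category, items in tasks.items():
    print(category, "->", items)
```

Once an artist completes an item, the produced asset would be inserted into its designated slot in the database, as the text describes, and the application can then assemble the completed media for playback.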
- It is intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention.
Claims (27)
1. A method for identifying coordinated actions, the method comprising:
displaying a graphical view of a flowchart representing a multimedia experience;
defining, within the flowchart, a cell including a primary action configured to trigger a playback of a primary media asset and secondary actions configured to trigger a playback of secondary media assets, wherein each of the secondary actions has a timing coordinated with the primary action;
displaying a representation of the primary action;
displaying representations of the secondary actions in chronological order according to their timing; and
visually associating each of the representations of the secondary actions with the representation of the primary action.
2. The method of claim 1 further comprising changing the timing of at least one of the secondary actions and updating the display of the representations of the secondary actions in chronological order according to their timing.
3. The method of claim 1 , wherein at least one of the representations of the secondary actions includes a description of a secondary media asset triggered by a secondary action and a value for the timing of the secondary action.
4. The method of claim 1 further comprising selectively minimizing the display of the representations of the secondary actions.
5. The method of claim 1 , wherein the representations of the secondary actions are visually linked with the representation of the primary action within a graphical representation of the cell.
6. The method of claim 5 , wherein the representation of the primary action is displayed in a first display area of the graphical representation of the cell and the representations of the secondary actions are displayed in a second display area of the graphical representation of the cell.
7. The method of claim 6 , wherein the representations of the secondary actions include descriptions of the secondary media assets triggered by the secondary actions and values for the timing of the secondary actions.
8. The method of claim 5 further comprising editing the secondary actions directly in the graphical representation of the cell.
9. The method of claim 8 , wherein editing the secondary actions directly in the graphical representation of the cell includes:
selecting a target secondary action in the graphical representation of the cell;
identifying the target secondary action by displaying a representation of the target secondary action differently from representations of other secondary actions; and
applying an input received in the cell to the target secondary action.
10. The method of claim 1 , wherein the graphical view of the flowchart is displayed in a first display region, and further comprising displaying a textual view of the cell in a second display region.
11. The method of claim 1 further comprising:
selecting a first target secondary action;
displaying a first properties window, including a description of the secondary media asset triggered by the first target secondary action and a value for the timing of the first target secondary action;
selecting a second target secondary action;
displaying a second properties window, including a description of the secondary media asset triggered by the second target secondary action and a value for the timing of the second target secondary action;
applying an input received in the first properties window to the first target secondary action; and
applying an input received in the second properties window to the second target secondary action.
12. The method of claim 1 further comprising:
selecting a plurality of target secondary actions;
displaying descriptions of the secondary media assets triggered by the target secondary actions and values for the timing of the target secondary actions in a properties window; and
applying an input received in the properties window to at least one of the target secondary actions.
13. The method of claim 1 , wherein at least one of the secondary actions is configured to trigger a presentation of text, and wherein displaying a representation of the at least one of the secondary actions includes displaying the text.
14. The method of claim 13 further comprising editing the text directly in the flowchart.
15. A method for identifying coordinated actions, the method comprising:
displaying a graphical view of a flowchart representing a multimedia experience, the flowchart including a cell representing a playback of coordinated media content;
defining, within the flowchart, a plurality of actions associated with the cell and configured to trigger the playback of coordinated media content, including a primary action and at least one secondary action having a timing coordinated with the primary action; and
differentiating within a graphical representation of the cell a display of the primary action and a display of the at least one secondary action.
16. The method of claim 15 , wherein the at least one secondary action includes a plurality of secondary actions, and further comprising sorting the secondary actions in chronological order according to their timing.
17. The method of claim 15 , wherein the display of the at least one secondary action includes displaying a description of media content associated with the at least one secondary action and a value for the timing of the at least one secondary action.
18. The method of claim 15 , wherein differentiating within the graphical representation of the cell the display of the primary action and the display of the at least one secondary action includes displaying the primary action in a first display area of the cell and displaying the at least one secondary action in a second display area of the cell proximate the first display area.
19. The method of claim 18 further comprising selectively minimizing the display of the at least one secondary action.
20. The method of claim 15 further comprising editing the at least one secondary action directly in the cell.
21. The method of claim 15 , wherein the at least one secondary action is configured to trigger a presentation of text, and wherein the display of the at least one secondary action includes the text.
22. The method of claim 21 further comprising editing the text directly in the flowchart.
23. A method for identifying coordinated actions, the method comprising:
displaying a graphical view of a flowchart representing a multimedia experience;
defining, within the flowchart, a cell including a primary action configured to trigger a playback of a primary media asset and secondary actions configured to trigger a playback of secondary media assets,
associating each of the secondary actions with a timing coordinated with the primary action, wherein at least one secondary action is associated with a timing defined by a time relative to and prior to the end of the playback of the primary media asset triggered by the primary action;
displaying a representation of the primary action;
displaying representations of the secondary actions; and
visually associating each of the representations of the secondary actions with the representation of the primary action.
24. The method of claim 23 , wherein at least one other secondary action is associated with a timing defined by a time relative to a start of the playback of the primary media asset.
25. The method of claim 23 , wherein the representations of the secondary actions are displayed in chronological order according to their timing.
26. The method of claim 23 further comprising changing a timing of at least one of the secondary actions and updating the display of the representations of the secondary actions in chronological order according to their timing.
27. The method of claim 23 , wherein a representation of a secondary action includes a description of the secondary media asset triggered by the secondary action and a value for the timing of the secondary action.
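Claims 23-27 describe a cell that holds one primary action plus secondary actions whose timings are anchored either to the start of the primary asset's playback or to a point before its end, with the secondaries displayed in chronological order. A minimal sketch of that data model in Python follows; the class and field names (`Action`, `Cell`, `anchor`, `primary_duration`) are hypothetical illustrations, not names from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """An action that triggers playback of a media asset (hypothetical model)."""
    asset: str             # description of the media asset, per claim 27
    timing: float          # offset in seconds, interpreted via `anchor`
    anchor: str = "start"  # "start" = relative to playback start (claim 24);
                           # "end"   = relative to and prior to playback end (claim 23)

@dataclass
class Cell:
    """A flowchart cell with a primary action and coordinated secondaries."""
    primary: Action
    primary_duration: float = 0.0          # length of the primary asset, seconds
    secondaries: list = field(default_factory=list)

    def absolute_time(self, action: Action) -> float:
        # Resolve an anchor-relative timing to an offset from the start
        # of the primary asset's playback.
        if action.anchor == "end":
            return self.primary_duration - action.timing
        return action.timing

    def secondaries_in_order(self) -> list:
        # Claims 25-26: secondary actions are displayed (and re-displayed
        # after a timing change) in chronological order by timing.
        return sorted(self.secondaries, key=self.absolute_time)
```

For example, with a 10-second primary asset, a secondary anchored 2 seconds after the start and another anchored 1 second before the end (i.e., at 9 seconds) would be listed in that order regardless of the order they were defined.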
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/957,076 US20080184143A1 (en) | 2006-12-14 | 2007-12-14 | Methods for Identifying Actions in a Flowchart |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US87517906P | 2006-12-14 | 2006-12-14 | |
US87507106P | 2006-12-14 | 2006-12-14 | |
US11/957,076 US20080184143A1 (en) | 2006-12-14 | 2007-12-14 | Methods for Identifying Actions in a Flowchart |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080184143A1 true US20080184143A1 (en) | 2008-07-31 |
Family
ID=39585829
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/957,076 Abandoned US20080184143A1 (en) | 2006-12-14 | 2007-12-14 | Methods for Identifying Actions in a Flowchart |
US11/957,066 Active 2030-01-13 US8127238B2 (en) | 2006-12-14 | 2007-12-14 | System and method for controlling actions within a programming environment |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/957,066 Active 2030-01-13 US8127238B2 (en) | 2006-12-14 | 2007-12-14 | System and method for controlling actions within a programming environment |
Country Status (1)
Country | Link |
---|---|
US (2) | US20080184143A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070255747A1 (en) * | 2006-04-27 | 2007-11-01 | Samsung Electronics Co., Ltd. | System, method and medium browsing media content using meta data |
US20080065977A1 (en) * | 2002-01-02 | 2008-03-13 | Gottlieb Harry N | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US20080104121A1 (en) * | 2006-10-31 | 2008-05-01 | Gottlieb Harry N | Methods For Preloading Media Assets |
US20090172559A1 (en) * | 2007-12-28 | 2009-07-02 | Microsoft Corporation | Creating and editing dynamic graphics via a web interface |
US20090259790A1 (en) * | 2008-04-15 | 2009-10-15 | Razer (Asia-Pacific) Pte Ltd | Ergonomic slider-based selector |
US20100281359A1 (en) * | 2009-04-30 | 2010-11-04 | International Business Machines Corporation | Method, apparatus and system for processing graphic objects |
US8127238B2 (en) | 2006-12-14 | 2012-02-28 | The Jellyvision Lab, Inc. | System and method for controlling actions within a programming environment |
US8276058B2 (en) | 2007-02-08 | 2012-09-25 | The Jellyvision Lab, Inc. | Method of automatically populating and generating flowchart cells |
US20160358627A1 (en) * | 2015-06-05 | 2016-12-08 | Disney Enterprises, Inc. | Script-based multimedia presentation |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8276098B2 (en) | 2006-12-22 | 2012-09-25 | Apple Inc. | Interactive image thumbnails |
US9142253B2 (en) * | 2006-12-22 | 2015-09-22 | Apple Inc. | Associating keywords to media |
US9542376B2 (en) * | 2013-12-11 | 2017-01-10 | Sehul S. SHAH | System and method for creating, editing, and navigating one or more flowcharts |
WO2018129383A1 (en) * | 2017-01-09 | 2018-07-12 | Inmusic Brands, Inc. | Systems and methods for musical tempo detection |
Citations (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4538188A (en) * | 1982-12-22 | 1985-08-27 | Montage Computer Corporation | Video composition method and apparatus |
US4831525A (en) * | 1984-01-30 | 1989-05-16 | Hitachi, Ltd. | Method of generating a source program from inputted schematic information |
US4831580A (en) * | 1985-07-12 | 1989-05-16 | Nippon Electric Industry Co., Ltd. | Program generator |
US4852047A (en) * | 1987-04-14 | 1989-07-25 | Universal Automation Inc. | Continuous flow chart, improved data format and debugging system for programming and operation of machines |
US4875187A (en) * | 1986-07-31 | 1989-10-17 | British Telecommunications, Plc | Processing apparatus for generating flow charts |
US4893256A (en) * | 1986-04-04 | 1990-01-09 | International Business Machines Corporation | Interactive video composition and presentation systems |
US4931950A (en) * | 1988-07-25 | 1990-06-05 | Electric Power Research Institute | Multimedia interface and method for computer system |
US4933880A (en) * | 1988-06-15 | 1990-06-12 | International Business Machines Corp. | Method for dynamically processing non-text components in compound documents |
US5111409A (en) * | 1989-07-21 | 1992-05-05 | Elon Gasper | Authoring and use systems for sound synchronized animation |
US5261041A (en) * | 1990-12-28 | 1993-11-09 | Apple Computer, Inc. | Computer controlled animation system based on definitional animated objects and methods of manipulating same |
US5307456A (en) * | 1990-12-04 | 1994-04-26 | Sony Electronics, Inc. | Integrated multi-media production and authoring system |
US5386508A (en) * | 1990-08-24 | 1995-01-31 | Fuji Xerox Co., Ltd. | Apparatus for generating programs from inputted flowchart images |
US5418622A (en) * | 1992-10-27 | 1995-05-23 | Sony Corporation | Apparatus for recording and reproducing audio and video signals in accordance with a broadcast schedule |
US5430872A (en) * | 1993-03-12 | 1995-07-04 | Asymetrix Corporation | Verifying multimedia linking for a multimedia presentation |
US5446911A (en) * | 1989-11-16 | 1995-08-29 | Sharp Kabushiki Kaisha | Apparatus for automatically preparing a flowchart by analyzing a source program and a list of jump commands |
US5515490A (en) * | 1993-11-05 | 1996-05-07 | Xerox Corporation | Method and system for temporally formatting data presentation in time-dependent documents |
US5519828A (en) * | 1991-08-02 | 1996-05-21 | The Grass Valley Group Inc. | Video editing operator interface for aligning timelines |
US5546529A (en) * | 1994-07-28 | 1996-08-13 | Xerox Corporation | Method and apparatus for visualization of database search results |
US5555357A (en) * | 1994-06-03 | 1996-09-10 | Claris Corporation | Computer system and method for generating and manipulating charts and diagrams |
US5581759A (en) * | 1990-04-02 | 1996-12-03 | Hitachi, Ltd. | Apparatus and method for controlling a system process |
US5586311A (en) * | 1994-02-14 | 1996-12-17 | American Airlines, Inc. | Object oriented data access and analysis system |
US5590253A (en) * | 1994-07-14 | 1996-12-31 | Fanuc Ltd. | Sequence program edit system for inserting additional graphic elements and for automatically inserting vertical connector lines |
US5592602A (en) * | 1994-05-17 | 1997-01-07 | Macromedia, Inc. | User interface and method for controlling and displaying multimedia motion, visual, and sound effects of an object on a display |
US5619636A (en) * | 1994-02-17 | 1997-04-08 | Autodesk, Inc. | Multimedia publishing system |
US5630017A (en) * | 1991-02-19 | 1997-05-13 | Bright Star Technology, Inc. | Advanced tools for speech synchronized animation |
US5640590A (en) * | 1992-11-18 | 1997-06-17 | Canon Information Systems, Inc. | Method and apparatus for scripting a text-to-speech-based multimedia presentation |
US5692212A (en) * | 1994-06-22 | 1997-11-25 | Roach; Richard Gregory | Interactive multimedia movies and techniques |
US5697788A (en) * | 1994-10-11 | 1997-12-16 | Aleph Logic Ltd. | Algorithm training system |
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US5717879A (en) * | 1995-11-03 | 1998-02-10 | Xerox Corporation | System for the capture and replay of temporal data representing collaborative activities |
US5721959A (en) * | 1988-07-01 | 1998-02-24 | Canon Kabushiki Kaisha | Information processing apparatus for pattern editing using logic relationship representative patterns |
US5752029A (en) * | 1992-04-10 | 1998-05-12 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using references to tracks in the composition to define components of the composition |
US5806079A (en) * | 1993-11-19 | 1998-09-08 | Smartpatents, Inc. | System, method, and computer program product for using intelligent notes to organize, link, and manipulate disparate data objects |
US5818435A (en) * | 1994-06-10 | 1998-10-06 | Matsushita Electric Industrial | Multimedia data presentation device and editing device with automatic default selection of scenes |
US5870768A (en) * | 1994-04-29 | 1999-02-09 | International Business Machines Corporation | Expert system and method employing hierarchical knowledge base, and interactive multimedia/hypermedia applications |
US5893105A (en) * | 1996-11-12 | 1999-04-06 | Micrografx, Inc. | Executable flowchart |
US5892507A (en) * | 1995-04-06 | 1999-04-06 | Avid Technology, Inc. | Computer system for authoring a multimedia composition using a visual representation of the multimedia composition |
US5905981A (en) * | 1996-12-09 | 1999-05-18 | Microsoft Corporation | Automatically associating archived multimedia content with current textual content |
US6034692A (en) * | 1996-08-01 | 2000-03-07 | U.S. Philips Corporation | Virtual environment navigation |
US6058333A (en) * | 1996-08-27 | 2000-05-02 | Steeplechase Software, Inc. | Animation of execution history |
US6072480A (en) * | 1997-11-05 | 2000-06-06 | Microsoft Corporation | Method and apparatus for controlling composition and performance of soundtracks to accompany a slide show |
US6097887A (en) * | 1997-10-27 | 2000-08-01 | Kla-Tencor Corporation | Software system and method for graphically building customized recipe flowcharts |
US6100881A (en) * | 1997-10-22 | 2000-08-08 | Gibbons; Hugh | Apparatus and method for creating interactive multimedia presentation using a shoot lost to keep track of audio objects of a character |
US6144938A (en) * | 1998-05-01 | 2000-11-07 | Sun Microsystems, Inc. | Voice user interface with personality |
US6154600A (en) * | 1996-08-06 | 2000-11-28 | Applied Magic, Inc. | Media editor for non-linear editing system |
US6179490B1 (en) * | 1993-12-23 | 2001-01-30 | Telefonaktiebolaget Lm Ericsson | Method and apparatus for creating a flowchart using a programmed computer which will automatically result in a structured program |
US6184879B1 (en) * | 1996-04-26 | 2001-02-06 | Matsushita Electric Industrial Co., Ltd. | Multi-media title editing apparatus and a style creation device employed therefor |
US6212674B1 (en) * | 1996-04-22 | 2001-04-03 | Alcatel | Graphic control process for controlling operations in a network management system |
US6239800B1 (en) * | 1997-12-15 | 2001-05-29 | International Business Machines Corporation | Method and apparatus for leading a user through a software installation procedure via interaction with displayed graphs |
US6243857B1 (en) * | 1998-02-17 | 2001-06-05 | Nemasoft, Inc. | Windows-based flowcharting and code generation system |
US6314569B1 (en) * | 1998-11-25 | 2001-11-06 | International Business Machines Corporation | System for video, audio, and graphic presentation in tandem with video/audio play |
US6346945B1 (en) * | 1998-12-28 | 2002-02-12 | Klocwork Solutions | Method and apparatus for pattern-based flowcharting of source code |
US6356867B1 (en) * | 1998-11-26 | 2002-03-12 | Creator Ltd. | Script development systems and methods useful therefor |
US20020038206A1 (en) * | 2000-05-04 | 2002-03-28 | Dov Dori | Modeling system |
US6370683B1 (en) * | 1999-05-07 | 2002-04-09 | Arnold Sobers | Computer software for generating flowchart images of a source program |
US20020089527A1 (en) * | 2001-01-08 | 2002-07-11 | Paradies James I. | System and method for generating an MNP flowchart (multi-nodal-progression) |
US6421821B1 (en) * | 1999-03-10 | 2002-07-16 | Ronald J. Lavallee | Flow chart-based programming method and system for object-oriented languages |
US20020140731A1 (en) * | 2001-03-28 | 2002-10-03 | Pavitra Subramaniam | Engine to present a user interface based on a logical structure, such as one for a customer relationship management system, across a web site |
US6496208B1 (en) * | 1998-09-10 | 2002-12-17 | Microsoft Corporation | Method and apparatus for visualizing and exploring large hierarchical structures |
US6654803B1 (en) * | 1999-06-30 | 2003-11-25 | Nortel Networks Limited | Multi-panel route monitoring graphical user interface, system and method |
US6674955B2 (en) * | 1997-04-12 | 2004-01-06 | Sony Corporation | Editing device and editing method |
US20040070594A1 (en) * | 1997-07-12 | 2004-04-15 | Burke Trevor John | Method and apparatus for programme generation and classification |
US6754540B1 (en) * | 2000-07-24 | 2004-06-22 | Entivity, Inc. | Flowchart-based control system including external functions |
US20040196310A1 (en) * | 2000-11-01 | 2004-10-07 | Microsoft Corporation | System and method for creating customizable nodes in a network diagram |
US6816174B2 (en) * | 2000-12-18 | 2004-11-09 | International Business Machines Corporation | Method and apparatus for variable density scroll area |
US6873344B2 (en) * | 2001-02-22 | 2005-03-29 | Sony Corporation | Media production system using flowgraph representation of operations |
US6973639B2 (en) * | 2000-01-25 | 2005-12-06 | Fujitsu Limited | Automatic program generation technology using data structure resolution unit |
US20070227537A1 (en) * | 2005-12-02 | 2007-10-04 | Nellcor Puritan Bennett Incorporated | Systems and Methods for Facilitating Management of Respiratory Care |
US20070234196A1 (en) * | 1999-04-14 | 2007-10-04 | Verizon Corporate Services Group Inc. | Methods and systems for selection of multimedia presentations |
US20070240046A1 (en) * | 2001-11-20 | 2007-10-11 | Heung-Wah Yan | Method and apparatus for controlling view navigation in workflow systems |
US7310784B1 (en) * | 2002-01-02 | 2007-12-18 | The Jellyvision Lab, Inc. | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US7340715B2 (en) * | 2004-03-10 | 2008-03-04 | Hanbai Liu | Visual programming method and system thereof |
US20080104121A1 (en) * | 2006-10-31 | 2008-05-01 | Gottlieb Harry N | Methods For Preloading Media Assets |
US20080244376A1 (en) * | 2007-02-08 | 2008-10-02 | Gottlieb Harry N | Method of automatically populating and generating flowchart cells |
US7647577B2 (en) * | 2004-05-31 | 2010-01-12 | International Business Machines Corporation | Editing, creating, and verifying reorganization of flowchart, and transforming between flowchart and tree diagram |
US7793219B1 (en) * | 2006-12-07 | 2010-09-07 | Adobe Systems Inc. | Construction of multimedia compositions |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63127333A (en) | 1986-11-17 | 1988-05-31 | Omron Tateisi Electronics Co | Flowchart type programming device |
US5119474A (en) | 1989-06-16 | 1992-06-02 | International Business Machines Corp. | Computer-based, audio/visual creation and presentation system and method |
US5218672A (en) | 1990-01-19 | 1993-06-08 | Sony Corporation Of America | Offline editing system with user interface for controlling edit list generation |
JPH05257666A (en) | 1992-03-13 | 1993-10-08 | Hitachi Ltd | Automatic flowchart generating method |
JPH0896001A (en) | 1994-09-22 | 1996-04-12 | Hitachi Software Eng Co Ltd | Flowchart editing device |
AU4688996A (en) | 1994-12-22 | 1996-07-10 | Bell Atlantic Network Services, Inc. | Authoring tools for multimedia application development and network delivery |
JPH11506574A (en) | 1995-02-23 | 1999-06-08 | アヴィッド・テクノロジー・インコーポレーテッド | Combining editing and digital video recording systems |
JP3307790B2 (en) | 1995-03-24 | 2002-07-24 | 日立ソフトウエアエンジニアリング株式会社 | Flow diagram creation device |
JP4670136B2 (en) | 2000-10-11 | 2011-04-13 | ソニー株式会社 | Authoring system, authoring method, and storage medium |
EP1669996A3 (en) | 2001-06-14 | 2006-07-05 | Samsung Electronics Co., Ltd. | Information storage medium containing preload information, apparatus and method for reproducing therefor |
US20080184143A1 (en) | 2006-12-14 | 2008-07-31 | Gottlieb Harry N | Methods for Identifying Actions in a Flowchart |
2007
- 2007-12-14 US US11/957,076 patent/US20080184143A1/en not_active Abandoned
- 2007-12-14 US US11/957,066 patent/US8127238B2/en active Active
Patent Citations (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4538188A (en) * | 1982-12-22 | 1985-08-27 | Montage Computer Corporation | Video composition method and apparatus |
US4831525A (en) * | 1984-01-30 | 1989-05-16 | Hitachi, Ltd. | Method of generating a source program from inputted schematic information |
US4956773A (en) * | 1984-01-30 | 1990-09-11 | Hitachi, Ltd. | System and method of generating a source program from inputted schematic information |
US4831580A (en) * | 1985-07-12 | 1989-05-16 | Nippon Electric Industry Co., Ltd. | Program generator |
US4893256A (en) * | 1986-04-04 | 1990-01-09 | International Business Machines Corporation | Interactive video composition and presentation systems |
US4875187A (en) * | 1986-07-31 | 1989-10-17 | British Telecommunications, Plc | Processing apparatus for generating flow charts |
US4852047A (en) * | 1987-04-14 | 1989-07-25 | Universal Automation Inc. | Continuous flow chart, improved data format and debugging system for programming and operation of machines |
US4933880A (en) * | 1988-06-15 | 1990-06-12 | International Business Machines Corp. | Method for dynamically processing non-text components in compound documents |
US5721959A (en) * | 1988-07-01 | 1998-02-24 | Canon Kabushiki Kaisha | Information processing apparatus for pattern editing using logic relationship representative patterns |
US4931950A (en) * | 1988-07-25 | 1990-06-05 | Electric Power Research Institute | Multimedia interface and method for computer system |
US5111409A (en) * | 1989-07-21 | 1992-05-05 | Elon Gasper | Authoring and use systems for sound synchronized animation |
US5446911A (en) * | 1989-11-16 | 1995-08-29 | Sharp Kabushiki Kaisha | Apparatus for automatically preparing a flowchart by analyzing a source program and a list of jump commands |
US5581759A (en) * | 1990-04-02 | 1996-12-03 | Hitachi, Ltd. | Apparatus and method for controlling a system process |
US5386508A (en) * | 1990-08-24 | 1995-01-31 | Fuji Xerox Co., Ltd. | Apparatus for generating programs from inputted flowchart images |
US5307456A (en) * | 1990-12-04 | 1994-04-26 | Sony Electronics, Inc. | Integrated multi-media production and authoring system |
US5261041A (en) * | 1990-12-28 | 1993-11-09 | Apple Computer, Inc. | Computer controlled animation system based on definitional animated objects and methods of manipulating same |
US5630017A (en) * | 1991-02-19 | 1997-05-13 | Bright Star Technology, Inc. | Advanced tools for speech synchronized animation |
US5519828A (en) * | 1991-08-02 | 1996-05-21 | The Grass Valley Group Inc. | Video editing operator interface for aligning timelines |
US5752029A (en) * | 1992-04-10 | 1998-05-12 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using references to tracks in the composition to define components of the composition |
US5754851A (en) * | 1992-04-10 | 1998-05-19 | Avid Technology, Inc. | Method and apparatus for representing and editing multimedia compositions using recursively defined components |
US5418622A (en) * | 1992-10-27 | 1995-05-23 | Sony Corporation | Apparatus for recording and reproducing audio and video signals in accordance with a broadcast schedule |
US5640590A (en) * | 1992-11-18 | 1997-06-17 | Canon Information Systems, Inc. | Method and apparatus for scripting a text-to-speech-based multimedia presentation |
US5430872A (en) * | 1993-03-12 | 1995-07-04 | Asymetrix Corporation | Verifying multimedia linking for a multimedia presentation |
US5530856A (en) * | 1993-03-12 | 1996-06-25 | Asymetrix Corporation | Verifying multimedia linking for a multimedia presentation |
US5515490A (en) * | 1993-11-05 | 1996-05-07 | Xerox Corporation | Method and system for temporally formatting data presentation in time-dependent documents |
US5806079A (en) * | 1993-11-19 | 1998-09-08 | Smartpatents, Inc. | System, method, and computer program product for using intelligent notes to organize, link, and manipulate disparate data objects |
US6179490B1 (en) * | 1993-12-23 | 2001-01-30 | Telefonaktiebolaget Lm Ericsson | Method and apparatus for creating a flowchart using a programmed computer which will automatically result in a structured program |
US5586311A (en) * | 1994-02-14 | 1996-12-17 | American Airlines, Inc. | Object oriented data access and analysis system |
US5619636A (en) * | 1994-02-17 | 1997-04-08 | Autodesk, Inc. | Multimedia publishing system |
US5870768A (en) * | 1994-04-29 | 1999-02-09 | International Business Machines Corporation | Expert system and method employing hierarchical knowledge base, and interactive multimedia/hypermedia applications |
US5592602A (en) * | 1994-05-17 | 1997-01-07 | Macromedia, Inc. | User interface and method for controlling and displaying multimedia motion, visual, and sound effects of an object on a display |
US5555357A (en) * | 1994-06-03 | 1996-09-10 | Claris Corporation | Computer system and method for generating and manipulating charts and diagrams |
US5818435A (en) * | 1994-06-10 | 1998-10-06 | Matsushita Electric Industrial | Multimedia data presentation device and editing device with automatic default selection of scenes |
US5692212A (en) * | 1994-06-22 | 1997-11-25 | Roach; Richard Gregory | Interactive multimedia movies and techniques |
US5590253A (en) * | 1994-07-14 | 1996-12-31 | Fanuc Ltd. | Sequence program edit system for inserting additional graphic elements and for automatically inserting vertical connector lines |
US5546529A (en) * | 1994-07-28 | 1996-08-13 | Xerox Corporation | Method and apparatus for visualization of database search results |
US5697788A (en) * | 1994-10-11 | 1997-12-16 | Aleph Logic Ltd. | Algorithm training system |
US5892507A (en) * | 1995-04-06 | 1999-04-06 | Avid Technology, Inc. | Computer system for authoring a multimedia composition using a visual representation of the multimedia composition |
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US5717879A (en) * | 1995-11-03 | 1998-02-10 | Xerox Corporation | System for the capture and replay of temporal data representing collaborative activities |
US6212674B1 (en) * | 1996-04-22 | 2001-04-03 | Alcatel | Graphic control process for controlling operations in a network management system |
US6184879B1 (en) * | 1996-04-26 | 2001-02-06 | Matsushita Electric Industrial Co., Ltd. | Multi-media title editing apparatus and a style creation device employed therefor |
US6034692A (en) * | 1996-08-01 | 2000-03-07 | U.S. Philips Corporation | Virtual environment navigation |
US6154600A (en) * | 1996-08-06 | 2000-11-28 | Applied Magic, Inc. | Media editor for non-linear editing system |
US6058333A (en) * | 1996-08-27 | 2000-05-02 | Steeplechase Software, Inc. | Animation of execution history |
US5893105A (en) * | 1996-11-12 | 1999-04-06 | Micrografx, Inc. | Executable flowchart |
US5905981A (en) * | 1996-12-09 | 1999-05-18 | Microsoft Corporation | Automatically associating archived multimedia content with current textual content |
US6674955B2 (en) * | 1997-04-12 | 2004-01-06 | Sony Corporation | Editing device and editing method |
US20040070594A1 (en) * | 1997-07-12 | 2004-04-15 | Burke Trevor John | Method and apparatus for programme generation and classification |
US6100881A (en) * | 1997-10-22 | 2000-08-08 | Gibbons; Hugh | Apparatus and method for creating interactive multimedia presentation using a shoot lost to keep track of audio objects of a character |
US6097887A (en) * | 1997-10-27 | 2000-08-01 | Kla-Tencor Corporation | Software system and method for graphically building customized recipe flowcharts |
US6072480A (en) * | 1997-11-05 | 2000-06-06 | Microsoft Corporation | Method and apparatus for controlling composition and performance of soundtracks to accompany a slide show |
US6239800B1 (en) * | 1997-12-15 | 2001-05-29 | International Business Machines Corporation | Method and apparatus for leading a user through a software installation procedure via interaction with displayed graphs |
US6243857B1 (en) * | 1998-02-17 | 2001-06-05 | Nemasoft, Inc. | Windows-based flowcharting and code generation system |
US6334103B1 (en) * | 1998-05-01 | 2001-12-25 | General Magic, Inc. | Voice user interface with personality |
US6144938A (en) * | 1998-05-01 | 2000-11-07 | Sun Microsystems, Inc. | Voice user interface with personality |
US6496208B1 (en) * | 1998-09-10 | 2002-12-17 | Microsoft Corporation | Method and apparatus for visualizing and exploring large hierarchical structures |
US6314569B1 (en) * | 1998-11-25 | 2001-11-06 | International Business Machines Corporation | System for video, audio, and graphic presentation in tandem with video/audio play |
US6356867B1 (en) * | 1998-11-26 | 2002-03-12 | Creator Ltd. | Script development systems and methods useful therefor |
US6346945B1 (en) * | 1998-12-28 | 2002-02-12 | Klocwork Solutions | Method and apparatus for pattern-based flowcharting of source code |
US6421821B1 (en) * | 1999-03-10 | 2002-07-16 | Ronald J. Lavallee | Flow chart-based programming method and system for object-oriented languages |
US20070234196A1 (en) * | 1999-04-14 | 2007-10-04 | Verizon Corporate Services Group Inc. | Methods and systems for selection of multimedia presentations |
US6370683B1 (en) * | 1999-05-07 | 2002-04-09 | Arnold Sobers | Computer software for generating flowchart images of a source program |
US6654803B1 (en) * | 1999-06-30 | 2003-11-25 | Nortel Networks Limited | Multi-panel route monitoring graphical user interface, system and method |
US6973639B2 (en) * | 2000-01-25 | 2005-12-06 | Fujitsu Limited | Automatic program generation technology using data structure resolution unit |
US20020038206A1 (en) * | 2000-05-04 | 2002-03-28 | Dov Dori | Modeling system |
US6754540B1 (en) * | 2000-07-24 | 2004-06-22 | Entivity, Inc. | Flowchart-based control system including external functions |
US20040196310A1 (en) * | 2000-11-01 | 2004-10-07 | Microsoft Corporation | System and method for creating customizable nodes in a network diagram |
US6816174B2 (en) * | 2000-12-18 | 2004-11-09 | International Business Machines Corporation | Method and apparatus for variable density scroll area |
US20020089527A1 (en) * | 2001-01-08 | 2002-07-11 | Paradies James I. | System and method for generating an MNP flowchart (multi-nodal-progression) |
US6873344B2 (en) * | 2001-02-22 | 2005-03-29 | Sony Corporation | Media production system using flowgraph representation of operations |
US20020140731A1 (en) * | 2001-03-28 | 2002-10-03 | Pavitra Subramaniam | Engine to present a user interface based on a logical structure, such as one for a customer relationship management system, across a web site |
US20070240046A1 (en) * | 2001-11-20 | 2007-10-11 | Heung-Wah Yan | Method and apparatus for controlling view navigation in workflow systems |
US7310784B1 (en) * | 2002-01-02 | 2007-12-18 | The Jellyvision Lab, Inc. | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US20080065977A1 (en) * | 2002-01-02 | 2008-03-13 | Gottlieb Harry N | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US7340715B2 (en) * | 2004-03-10 | 2008-03-04 | Hanbai Liu | Visual programming method and system thereof |
US7647577B2 (en) * | 2004-05-31 | 2010-01-12 | International Business Machines Corporation | Editing, creating, and verifying reorganization of flowchart, and transforming between flowchart and tree diagram |
US20070227537A1 (en) * | 2005-12-02 | 2007-10-04 | Nellcor Puritan Bennett Incorporated | Systems and Methods for Facilitating Management of Respiratory Care |
US20080104121A1 (en) * | 2006-10-31 | 2008-05-01 | Gottlieb Harry N | Methods For Preloading Media Assets |
US7793219B1 (en) * | 2006-12-07 | 2010-09-07 | Adobe Systems Inc. | Construction of multimedia compositions |
US20080244376A1 (en) * | 2007-02-08 | 2008-10-02 | Gottlieb Harry N | Method of automatically populating and generating flowchart cells |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080065977A1 (en) * | 2002-01-02 | 2008-03-13 | Gottlieb Harry N | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US8464169B2 (en) | 2002-01-02 | 2013-06-11 | The Jellyvision Lab, Inc. | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart |
US20070255747A1 (en) * | 2006-04-27 | 2007-11-01 | Samsung Electronics Co., Ltd. | System, method and medium browsing media content using meta data |
US7930329B2 (en) * | 2006-04-27 | 2011-04-19 | Samsung Electronics Co., Ltd. | System, method and medium browsing media content using meta data |
US20080104121A1 (en) * | 2006-10-31 | 2008-05-01 | Gottlieb Harry N | Methods For Preloading Media Assets |
US8521709B2 (en) | 2006-10-31 | 2013-08-27 | The Jellyvision Lab, Inc. | Methods for preloading media assets |
US8127238B2 (en) | 2006-12-14 | 2012-02-28 | The Jellyvision Lab, Inc. | System and method for controlling actions within a programming environment |
US8276058B2 (en) | 2007-02-08 | 2012-09-25 | The Jellyvision Lab, Inc. | Method of automatically populating and generating flowchart cells |
US9037974B2 (en) * | 2007-12-28 | 2015-05-19 | Microsoft Technology Licensing, Llc | Creating and editing dynamic graphics via a web interface |
US20090172559A1 (en) * | 2007-12-28 | 2009-07-02 | Microsoft Corporation | Creating and editing dynamic graphics via a web interface |
US20090259790A1 (en) * | 2008-04-15 | 2009-10-15 | Razer (Asia-Pacific) Pte Ltd | Ergonomic slider-based selector |
US8970496B2 (en) * | 2008-04-15 | 2015-03-03 | Razer (Asia-Pacific) Pte. Ltd. | Ergonomic slider-based selector |
US20100281359A1 (en) * | 2009-04-30 | 2010-11-04 | International Business Machines Corporation | Method, apparatus and system for processing graphic objects |
US9098940B2 (en) * | 2009-04-30 | 2015-08-04 | International Business Machines Corporation | Method, apparatus and system for processing graphic objects |
US20160358627A1 (en) * | 2015-06-05 | 2016-12-08 | Disney Enterprises, Inc. | Script-based multimedia presentation |
US9805036B2 (en) * | 2015-06-05 | 2017-10-31 | Disney Enterprises, Inc. | Script-based multimedia presentation |
Also Published As
Publication number | Publication date |
---|---|
US8127238B2 (en) | 2012-02-28 |
US20080163084A1 (en) | 2008-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080184143A1 (en) | Methods for Identifying Actions in a Flowchart | |
US11238412B2 (en) | Multimedia calendar | |
US8464169B2 (en) | Methods for identifying cells in a path in a flowchart and for synchronizing graphical and textual views of a flowchart | |
US10608974B2 (en) | Computer-based authoring tool for narrative content delivered by an interactive message-based delivery system | |
US9588663B2 (en) | System and method for integrating interactive call-to-action, contextual applications with videos | |
US20050071736A1 (en) | Comprehensive and intuitive media collection and management tool | |
US20050069225A1 (en) | Binding interactive multichannel digital document system and authoring tool | |
US7062712B2 (en) | Binding interactive multichannel digital document system | |
JP5307911B2 (en) | High density interactive media guide | |
US7565608B2 (en) | Animation on object user interface | |
US20100058220A1 (en) | Systems, methods, and computer program products for the creation, monetization, distribution, and consumption of metacontent | |
US20070294619A1 (en) | Generating media presentations | |
US20150007035A1 (en) | Method, system and user interface for creating and displaying of presentations | |
CN102227695A (en) | Audiovisual user interface based on learned user preferences | |
US20030174160A1 (en) | Interactive presentation viewing system employing multi-media components | |
US20040201610A1 (en) | Video player and authoring tool for presentations with tangential content |
CN103092962B (en) | Method and system for publishing internet information |
CN101986249A (en) | Method for controlling a computer using gesture objects, and corresponding computer system |
US20210247882A1 (en) | Computer-based method and system for multidimensional media content organisation and presentation | |
WO2013016312A1 (en) | Web-based video navigation, editing and augmenting apparatus, system and method | |
US20110304627A1 (en) | Media exploration | |
CN112052315A (en) | Information processing method and device | |
WO2008076911A1 (en) | Methods for identifying actions in a flowchart | |
CN102572601A (en) | Display method and device for video information | |
CN115048010A (en) | Method, device, equipment and medium for displaying audiovisual works |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: THE JELLYVISION LAB, INC., ILLINOIS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOTTLIEB, HARRY N.;LOTT, EDWARD;JACOVER, EVAN;AND OTHERS;REEL/FRAME:021734/0658 Effective date: 20080324 |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |