US20060277466A1 - Bimodal user interaction with a simulated object - Google Patents

Bimodal user interaction with a simulated object

Info

Publication number
US20060277466A1
US20060277466A1 (US 2006/0277466 A1); application US11/433,173
Authority
US
United States
Prior art keywords
user
simulated
properties
holding
forces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/433,173
Inventor
Thomas Anderson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NOVINT TECHNOLOGIES Inc
Original Assignee
NOVINT TECHNOLOGIES Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NOVINT TECHNOLOGIES Inc filed Critical NOVINT TECHNOLOGIES Inc
Priority to US11/433,173
Priority to PCT/US2006/042557 (WO2007133251A2)
Assigned to NOVINT TECHNOLOGIES, INC. (assignment of assignors interest); Assignor: ANDERSON, THOMAS G
Publication of US20060277466A1
Priority to US12/783,386 (US9804672B2)
Status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present invention relates to methods and apparatuses related to user interaction with computer-simulated objects, and more specifically to methods and apparatuses for force feedback in user interaction where the behavior of an object differs depending on whether or not a user is holding or controlling the object.
  • new interface methods are needed to fully utilize the new modes of human-computer communication these capabilities enable.
  • new methods of interaction can use the additional human-computer communication paths to supplement or supplant conventional communication paths, freeing up traditional keyboard input and visual feedback bandwidth.
  • force feedback, or haptics, can be especially useful in allowing a user to feel parts of the interface, reducing the need for a user to visually manage interface characteristics that can be managed by feel.
  • Users interfacing with non-computer tasks routinely exploit the combination of visual and haptic feedback (seeing one side of a task while feeling the other); bringing this sensory combination into human-computer interfaces can make such interfaces more efficient and more intuitive for the user. Accordingly, there is a need for new methods of human-computer interfacing that make appropriate use of haptic and visual feedback.
  • many contemporary computer games require the user to throw or otherwise propel an object.
  • the games typically allow the user to specify a throw by pressing a button or combination of buttons.
  • the timing of the button press, often relative to the timing of other actions occurring in the game, controls the effect of the throw (e.g., the accuracy or distance of the throw).
  • Some games display a slider or power bar that indicates direction or force of a throw; the user must press the appropriate button when the slider or bar is at the right value for the desired throw.
  • the user can thereby control aspects of the throw, but not with any of the skills learned in real world throwing.
  • the direction of the user's hand motion and the force applied by the user near the release of the throw generally are not significant to the throwing action in the game.
  • the object being thrown is generally represented within the game independent of whether it is being held or has been released, forcing the user to adjust the control of the object to the constraints of the simulations in the game.
  • the present invention can provide a method of providing user interaction with a computer representation of a simulated object, where the user can control the object in three dimensions.
  • the method can provide for two distinct states: a “holding” state, and a “released” state.
  • the holding state roughly corresponds to the user holding the simulated object (although other metaphors such as holding a spring attached to the object, or controlling the object at a distance can also be suitable).
  • the released state roughly corresponds to the user not holding the object.
  • a simple example of the two states can include the holding, then throwing of a ball.
  • while in the holding state, the method provides force feedback to the user representative of forces that the user might experience if the user were holding an actual object. The forces are not applied when in the released state.
  • the present invention can allow the user to direct transitions from the holding state to the released state (e.g., releasing the ball at the end of the throwing motion), from the released state to the holding state (e.g., picking up a ball), or both.
  • the present invention can also provide forces that represent both pushing and pulling the simulated object.
  • the present invention can also accommodate different haptic and visual expectations of the user by providing different interaction of the object within a simulated space in the two modes. For example, a ball can be simulated with a large mass when being held by the user, to provide significant force feedback communication to the user. Upon release, however, the ball's mass internal to the simulation can be adjusted to a smaller value, to allow the ball to move and interact with other objects on a scale more fitting to the visual display capabilities.
  • the present invention can also provide a method of representing a simulated object within a three dimensional computer simulation.
  • the object can be represented within the simulation with a first set of properties when the user is directly controlling the object (e.g., holding the object), and with a second set of properties when the user is not directly controlling the object (e.g., after the user releases the object).
  • the properties different between the two sets can comprise properties of the simulation (e.g., time or gravity forces), properties of the simulated object (e.g., mass or inertia), or a combination thereof.
  • the simulation can comprise a simulation of real-world physics interactions, such as is supported by contemporary hardware accelerators, with the physics principles, the object properties, the environment properties, or a combination thereof, changed when the user initiates or terminates direct control of the object.
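As an illustration of this property swap, here is a minimal Python sketch (not from the patent; names such as SimObject and PropertySet are hypothetical). Grab and release events simply select between two property sets, one tuned for haptic feel and one tuned for in-game behavior.

```python
from dataclasses import dataclass

@dataclass
class PropertySet:
    mass: float        # kg, as felt (holding) or as simulated (released)
    gravity: float     # m/s^2 within the simulation
    time_scale: float  # multiplier applied to the simulation time step

@dataclass
class SimObject:
    holding: PropertySet
    released: PropertySet
    is_held: bool = False

    @property
    def active(self) -> PropertySet:
        # The simulation always reads properties through this accessor,
        # so a grab or release transparently swaps the whole set.
        return self.holding if self.is_held else self.released

    def grab(self):
        self.is_held = True

    def release(self):
        self.is_held = False

# Example: a ball that feels heavy in the hand (strong force feedback)
# but flies on a lighter, game-tuned mass once released.
ball = SimObject(
    holding=PropertySet(mass=2.0, gravity=9.8, time_scale=1.0),
    released=PropertySet(mass=0.4, gravity=9.8, time_scale=1.5),
)
```

The holding mass is deliberately larger than the released mass, mirroring the heavy-in-hand, light-in-flight ball example above.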
  • the present invention can be applied to computer game applications, where the present invention can provide for enhanced user experience in propelling an object.
  • the present invention can provide for a set of object and interaction forces that optimize the user experience of holding the object.
  • the present invention can provide for a set of object and simulation properties that optimize the simulated object's behavior within the game environment.
  • the present invention can be applied to games such as football, basketball, bowling, darts, and soccer.
  • Haptics is the field that studies the sense of touch. In computing, haptics refers to giving a User a sense of touch through a Haptic Device.
  • a Haptic Device (or Device) is the mechanism that allows a User to feel virtual objects and sensations. The forces created from a Haptic Device can be controlled through motors or any other way of transferring sensations to a User.
  • the position of a Device typically refers to the position of a handle on the Device that is held by the User. Any of the algorithms described can vary depending on where the handle of the Device is within its workspace.
  • Haptic Devices can have any number of Degrees of Freedom (DOF), and can have a different number of DOF for tracking than for forces.
  • a Haptic Device can track 3 DOF (x, y, and z), and output forces in 3 DOF (x, y, and z), in which case the tracked DOF are the same as the forces DOF.
  • a Haptic Device can track 6 DOF (x, y, and z, and rotation about x, y, and z), but only output forces in 3 DOF (x, y, and z), in which case the tracked DOF are a superset of the forces DOF.
  • any of a Device's DOF can be controlled by direct movements not relative to a fixed point in space (like a standard computer mouse), controlled by direct movements relative to a fixed point in space (like a mechanical tracker, mechanically grounded to a table it is resting on, where it can only move within a limited workspace), or controlled by forces against springs, movements around pivot points, twisting or turning a handle, or another type of limited movements (joystick, spaceball, etc).
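A small sketch of the tracked-versus-force DOF distinction (illustrative only; the names TRACK_DOF, FORCE_DOF, and clip_force_to_device are assumptions, not part of the patent): a 6-DOF tracker paired with a 3-DOF force output simply discards torque components it cannot render.

```python
TRACK_DOF = ("x", "y", "z", "rx", "ry", "rz")   # a 6-DOF position/rotation tracker
FORCE_DOF = ("x", "y", "z")                      # forces rendered only in translation

def clip_force_to_device(force: dict) -> dict:
    # Drop force components the device cannot actuate; here the
    # rotational torques of a 6-DOF pose are silently discarded.
    return {axis: f for axis, f in force.items() if axis in FORCE_DOF}
```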
  • a User is a person utilizing a Haptic Device to play a game or utilize some other type of application that gives a sense of touch.
  • the User can experience a simulation or game in ways that are consistent with a Character (described below) such that the User feels, sees, and does what the Character does.
  • the User can also have any portion or all of the interactions with the simulation be separate from the Character. For example, the User's view (i.e. what is seen on the monitor) does not have to be lined up with a Character's view (i.e. what a Character would see given the environment and the location of the Character's eyes), whether the Character is currently being controlled or not.
  • a Character is a person or object controlled by a User in a video game or application.
  • a Character can also be a first person view and representation of the User. Characters can be simple representations described only by graphics, or they can have complex characteristics such as Inverse Kinematic equations, body mass, muscles, energy, damage levels, artificial intelligence, or can represent any type of person, animal, or object in real life in varying degrees of realism.
  • a Character can be a complex system like a human, or it can simply be a simple geometric object like a marble in a marble controlling game. Characters and their information can be described and contained on a single computer, on multiple computers, and in online environments. Characters can interact with other Characters.
  • a Character can be controlled by the position of a Device or a Cursor, and a Character can be any Cursor or any object.
  • a Cursor is a virtual object controlled by a User controlling a Haptic Device. As the User moves the Haptic Device, the Cursor can move in some relationship to the movement of the Device. Typically, though not always, the Cursor moves proportionally to the movement of the Haptic Device along each axis (x,y,z). Those movements, however, can be scaled, rotated, skewed, or modified by any other function, and can be modified in these ways differently along different axes. For example, a Cursor can be controlled through a pivot point, where a movement of the Haptic Device to the right would move the Cursor to the left (the amount of movement can depend on a simulation of where the pivot point is located with respect to the Cursor).
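The device-to-cursor relationship described above (proportional, possibly scaled per axis, possibly mirrored through a pivot) could be sketched as follows; the function name and the uniform treatment of the pivot are assumptions for illustration.

```python
import numpy as np

def device_to_cursor(device_pos, scale=(1.0, 1.0, 1.0), pivot=None):
    """Map a 3-DOF device position to a cursor position.

    scale: per-axis gain (the mapping can differ along x, y, z).
    pivot: if given, the cursor mirrors device motion about this point,
           emulating the pivot-lever behavior described above.
    """
    p = np.asarray(device_pos, dtype=float) * np.asarray(scale, dtype=float)
    if pivot is not None:
        p = 2.0 * np.asarray(pivot, dtype=float) - p   # reflect through the pivot
    return p
```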
  • a Cursor can be a point, a sphere, any other geometric shape, a polygonal surface, a volumetric representation, a solids model, a spline based object, or can be described in any other mathematical way.
  • a Cursor can also be a combination of any number of those objects.
  • the graphical, haptic, and sound representations of a Cursor can be different from each other.
  • a Cursor can be a perfect haptic sphere, but a polygonal graphical sphere.
  • a Cursor can be controlled directly, or can be controlled through the interactions of one or more virtual objects that interact with another virtual object or other virtual objects.
  • a Haptic Device can control a point that is connected to a sphere with a spring, where the sphere is used as the Cursor.
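This spring-coupled arrangement is a common virtual-coupling pattern; a hedged sketch follows, with assumed stiffness, damping, and mass values. The negative of the spring force is what would be sent to the haptic device, so the user feels the sphere lagging behind the controlled point.

```python
import numpy as np

K_SPRING = 400.0   # N/m, stiffness of the point-to-sphere coupling (assumed)
B_DAMP   = 5.0     # N*s/m, damping to keep the sphere stable (assumed)
MASS     = 0.1     # kg, sphere (Cursor) mass (assumed)

def step_cursor(point_pos, sphere_pos, sphere_vel, dt):
    """Advance the spring-coupled sphere one step; returns (pos, vel, force).

    Positions and velocity are numpy 3-vectors. The returned force is the
    reaction on the device (the negative of the spring force on the sphere).
    """
    spring = K_SPRING * (point_pos - sphere_pos) - B_DAMP * sphere_vel
    accel = spring / MASS
    sphere_vel = sphere_vel + accel * dt
    sphere_pos = sphere_pos + sphere_vel * dt
    return sphere_pos, sphere_vel, -spring
```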
  • a Cursor's movements can be constrained in the visual, haptic, audio, or other sensory domain, preventing the Cursor, or its use, from moving into a specified area. Cursor movements can be constrained by objects, algorithms, or physical stops on a Device as examples.
  • the position of a Cursor and the forces created can be modified with any type of linear or non-linear transformation (for example, scaling in the x direction).
  • Position can be modified through transformations, and the forces created can be modified through an inverse function to modify the perception of Cursor movements, to modify objects (such as scale, rotation, etc), or to give interesting effects to the User.
  • a Cursor can have visual, haptic, and sound representations, and any properties of any of those three Cursor modalities can be different. For example, a Cursor can have different sound, graphic, and haptic representations. A Cursor does not need to be shown visually. Also, a Cursor can vary in any of the ways described above differently at different times. For example, a Cursor can have a consistent haptic and graphic position when not touching objects, but a different haptic and graphic position when objects are touched.
  • a Cursor can be shown graphically when preparing to perform an action (like beginning a snowboard run, beginning a golf swing, or selecting an object), and then not shown graphically when performing the action (snowboarding, swinging a golf club, or holding an object, respectively).
  • a Cursor can also be a representation of a User's interaction with a Character or an action, rather than a specific object used to touch other objects.
  • a Cursor can be the object used to implement a golf swing, and control the weight and feel of the club, even though the User never actually sees the Cursor directly.
  • a Cursor can change haptic and graphic characteristics as a function of time, or as a function of another variable.
  • a Cursor can be any object, any Character, or control any part of a Character or object.
  • a Cursor can be in the shape of a human hand, foot, or any other part or whole of a human, animal, or cartoon.
  • the shape, function, and motion of a Cursor can be related to that of an equivalent or similar object in the real world.
  • a Cursor shaped like a human hand can change wrist, hand, and finger positioning in order to gesture, grasp objects, or otherwise interact with objects similarly to how hands interact with real objects.
  • An interface in many games must provide the user with a method of indicating discharge of an object, for example release of a thrown ball.
  • Conventional game interfaces use buttons or switches—unrelated to usual methods of releasing objects and consequently not a very realistic interface effect.
  • in the real world, objects are thrown by imparting sufficient momentum to them.
  • a haptic interface can accommodate interaction that allows intuitive release.
  • One or more force membranes can be presented to the user, where a force membrane is a region of the haptic space accessible by the user that imposes a force opposing motion toward the membrane as the user approaches the membrane.
  • a membrane placed in the direction of the intended target can discourage the user from releasing the object accidentally, but can allow intentional release by application of sufficient force by the user to exceed the membrane's threshold.
  • consider a user throwing a football: the user brings the haptic interaction device back (as if to cock the arm back to throw) past a membrane, then pushes it forward (feeling the weight and drag of the football haptically), and if the user pushes the football forward fast enough to give it the required momentum, the football is thrown.
  • Motion of the object in a throwing direction can be accompanied by a combination of dynamics forces and viscosity to guide the user's movement. These forces can make directing the thrown object much easier.
  • the forces related to the membrane can drop abruptly when the object is thrown, or can be decreased over time, depending on the desired interface characteristics.
  • such a release mechanism can be used to throw balls or other objects (e.g., by pushing the object forward through a force barrier disposed between the user location and the target), to drop objects (e.g., by pushing the object downward through a force barrier between the user location and the floor), and to discharge weapons or blows (e.g., by pushing a representation of a weapon or character portion through a force barrier between the weapon or character portion and the target).
  • a membrane can apply an increasing force, and the object can be released when the user-applied force reaches a certain relation to the membrane's force (e.g., equals the maximum force, or is double the present force). Release can also be triggered by gesture recognition: a hand moving forward rapidly, then quickly slowing, can indicate a desired throw.
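One plausible implementation of a planar force membrane with a penetration-depth release threshold (the constants and the plane representation are assumptions; the patent also describes force-ratio and gesture triggers):

```python
import numpy as np

K_MEMBRANE = 800.0      # N/m, membrane stiffness (assumed)
RELEASE_DEPTH = 0.02    # m, penetration depth that triggers release (assumed)

def membrane_force(cursor_pos, plane_point, plane_normal):
    """Return (force, released) for a planar force membrane.

    plane_normal points back toward the user, so the membrane resists
    motion toward the target; pushing deeper than RELEASE_DEPTH releases
    the object, and the membrane force then drops abruptly to zero.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - cursor_pos, n)   # > 0 once past the plane
    if depth <= 0.0:
        return np.zeros(3), False                 # not touching the membrane
    if depth >= RELEASE_DEPTH:
        return np.zeros(3), True                  # threshold exceeded: throw
    return K_MEMBRANE * depth * n, False          # push back toward the user
```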
  • the direction of the object can be determined in various ways, some of which are discussed in more detail in U.S. provisional application 60/681,007, “Computer Interface Methods and Apparatuses,” filed May 13, 2005, incorporated herein by reference.
  • the position at release, pre-release, or both can be used to set direction;
  • the object can be modeled as attached by a spring to the cursor, and the direction of throw determined from the relative positions.
  • a visual indication can be combined with the haptic information to indicate status of the release; for example, an image of an arm about to throw can be displayed in connection with the image of the ball when the ball is ready to be thrown (pulled through the first membrane in the previous example).
  • a haptic cue can also be used, for example a vibration in the device or perceptible bump in its motion.
  • a bimodal interface can provide for two separate modes of object simulation, optionally selectable responsive to direction from the user.
  • in a first mode, the user directly controls a simulated object. This can correspond to, as examples, the user holding the simulated object, the user holding a spring or other intermediate structure that in turn holds the object, or the user controlling the object at a distance (e.g., according to a magical levitation spell or a space-type tractor beam).
  • in a second mode, the user does not directly control the simulated object. This can correspond to, as examples, the user having released a thrown object, the user dropping an object, some interaction with the simulated environment causing the user to lose control of the object, or the distance control being disrupted (e.g., the spell is broken).
  • the object can be represented to the user and to the simulation with different interaction properties in the two states, where “interaction properties” that can be different are aspects of the user's perception of the object that change upon transition between the modes, including as examples the properties of the object in the simulation; the properties of the object as perceived by the user (e.g., texture, size); the rate of determining object characteristics (e.g., position or deformation); simulation properties (e.g., time scales, physics models); environment properties (e.g., gravitational constants, relative size of objects); and execution of a corresponding software program on a different thread or a different processor.
  • the transition between the states can be at any point where the user associates the state transition with the release of the object, including simultaneous with the release of an object by the user or before or after such release.
  • “Holding” mode. In the first mode, the user can generally be allowed to hold the object, manipulate the object, and interact with the object and, through the object, with the simulated environment. After an object is touched or selected, a User can determine that the object should be held. This can be accomplished automatically (for example, an object can be automatically grabbed when it is touched) or the object can be held or grabbed based on a User input such as a button press, a switch press, a voice input, a gesture that is recognized, or some other type of User input. When an object is held, it can have dynamics properties such as weight, inertia, momentum, or any other physical property. It can be implemented as a weight, and a User can feel the weight and reaction forces as it is moved.
  • Objects that are held can have interactions with the environment. For example, a heavy object might need to be dragged across a virtual ground if it is too heavy to be picked up by the Character. In this case, as in real life, the User can feel the weight of the object, but that weight can be less than if the object was not touching the ground. As it is dragged across the ground, forces can be applied to the Cursor or object representing the friction or resistance of the object as it moves across the ground, bumps into things, or snags or gets caught on other virtual objects.
  • Objects that are held can also have forces applied to them (or directly to the Cursor) based on other virtual objects that interact with the object that is held. For example, a User might feed a virtual animal. As an apple is picked up the User might feel the weight of the apple, then, when a virtual horse bites the apple, the User can feel the apple being pulled and pushed, and the weight can be adjusted to reflect the material removed by the bite. Objects can be rotated while they are held. To rotate an object, a User can rotate the handle of the Device, or can modify the position of the Device so that the rotation of the object occurs based on a change in position of the Cursor.
  • Objects that are held can have other haptic characteristics.
  • a User could hold an object that is spinning and feel the resulting forces.
  • a virtual gyroscope could create directional forces that the User would feel.
  • a User can feel the acceleration of an object being held. For example, if a User holds a virtual firehose, the User might feel the force pushing back from the direction that firehose is spraying based on how much water is coming out, how close another virtual object is that is being sprayed, how tightly the hose is being held, how much pressure there is in the spraying, or other aspects of the simulation of the water being sprayed. If an object has its own forces, based on its representation, the User could feel them.
  • if a User held a virtual popcorn popper, for example, the User could feel the popcorn popping within it.
  • Each individual force of a popcorn kernel popping could be relayed to the User, or the forces could be represented through some other representation (such as a random approximation of the forces that a popcorn popper would create).
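A sketch of the "random approximation" alternative (names and rates are invented for illustration): instead of simulating each kernel, the held popper emits short force spikes at random intervals and in random directions.

```python
import random

class PopcornEffect:
    """Approximate forces from a held popcorn popper with random impulses.

    Illustrative only: each 'pop' is a brief force spike in a random
    direction, at a random interval, rather than a per-kernel simulation.
    """
    def __init__(self, rate_hz=8.0, peak_n=1.5):
        self.rate = rate_hz       # average pops per second (assumed)
        self.peak = peak_n        # peak impulse force in newtons (assumed)

    def sample(self, dt):
        # With probability rate*dt a pop occurs during this time step.
        if random.random() < self.rate * dt:
            direction = [random.uniform(-1, 1) for _ in range(3)]
            norm = sum(d * d for d in direction) ** 0.5 or 1.0
            return [self.peak * d / norm for d in direction]
        return [0.0, 0.0, 0.0]
```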
  • the forces that a User feels can have any form or representation which can represent any action or event.
  • a User might hold a virtual wand to cast spells. The User can feel any type of force through the wand, which can represent any type of spell.
  • the User can feel interactions of a virtual held object with another object that hits the object. For example, a User might feel the weight of a baseball bat, and a path constraint of how it can move. Then when the bat hits a baseball, that feeling can be felt by the User through a force applied to the bat or applied directly to the Cursor. If a User is holding onto a matador cape, and a virtual bull runs through the cape, the User would feel the pull of the cape against the Cursor as the bull pulls on the cape.
  • the User might also feel the force adjusted if the cape were to rip, the feeling of the cape being pulled out of the Character's hands, the cape getting caught in the bull's horns and being pulled harder, or any other interaction with the object being held.
  • a User might hold onto an object that is controlling the Character's movements. For example, the Character might grab onto the bumper of a moving car. Then, the User would feel the pull of the bumper relative to the movement of the Character. The feeling of pull could also be combined with other interactions such as maintaining balance.
  • a User can feel an object interact with another object that is caught by the object. For example the User can hold a net and catch a fish. The User can feel the forces of the fish flopping within the net either directly through forces applied to the Cursor, or through forces applied to the net through the fish's actions (and therefore applied to the Cursor, through its interaction with the net).
  • a User can control a held object to interact with an environment. For example, a User could hold onto a fishing pole and swing it forward to cast a line out into a lake. Then a User might feel a tugging force representing a fish nibbling at the hook. The User might pull on the fishing pole to hook the fish, and then feel the weight of the fish along the line added to the weight of the pole as the fish is reeled in. While an object is held, it can be shaken. A User can feel the changing inertia of an object that is moved back and forth at different rates. A virtual object might have liquid inside it that can be felt by a User through a force representation that varies based on how the liquid moves within the object. A User might feel an object slipping in his hands.
  • if a Character holds a rope that is being pulled, the User might feel a force as the rope is pulled. Then the User could feel a lesser force if the rope slips through his hands until the rope is grabbed tighter, at which point the rope stops sliding and the User feels a larger force again.
  • the User might also feel force variations representing the texture of an object that is sliding through his hands. An object that is held can be swung as well. The User can feel the weight of an object, and feel the force of the object change based on the relative positions of the Cursor (to which the object is attached) and the object.
  • “Released” mode. In the second mode, the User is no longer directly controlling or manipulating the object. The object can then interact with the remainder of the simulated environment as any other simulated object.
  • forces and motions that are most effective for efficient and intuitive User control of an object are not the same as forces and motions that are most suitable for the computer simulation.
  • the physical constraints of the range of motion of an input device can be larger than the actual display, but smaller than the total simulated space. Accordingly, motions of the User to control an object that are directly mapped to motion in the simulated space can be ill-suited for efficient User interaction.
  • forces applicable by the user and time over which such forces are applied can be different in interacting with a haptic input device than forces and times in the computer simulation.
  • a user of a haptic input device might have only a few inches of motion to simulate shooting a basketball, while the simulation might represent a more complete range of travel for the shooting action.
  • the basketball experienced by the user might be more effective if the mass represented by the force feedback to the haptic device is larger than that in the simulation (moving a larger basketball mass a shorter distance, in this simplified example).
  • the time variable in a simulation can be different when the user is holding an object than when the object is released.
  • Sophisticated user interaction experiences can be implemented by changing such parameters (e.g., mass, time, scale).
  • the user can be provided with a sense of imparting energy to an object by applying a relatively large force; changing the time variable in the simulation when the user releases the object can interface the user's control of the object with the simulated environment in a manner that provides a unique user experience. Even if the properties of the simulated object are unchanged, the rate at which the object's position, velocity, or other characteristics are determined can be different in the two modes.
  • the object's position and force interactions can be updated at a rate amenable to haptics interaction (e.g., 1000 Hz).
  • the object's position can be updated at a rate amenable to the simulation or to the display characteristics desired (e.g., 60 Hz).
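These two update rates can be decoupled by running the haptic loop and the simulation loop separately, which also matches the mention elsewhere in this section of executing on a different thread or processor. A minimal threaded sketch, with step_physics left as a stub (all names are assumptions):

```python
import threading
import time

def step_physics(state):
    # Stub: advance the simulated object (position, velocity, ...).
    return state

def haptic_loop(shared, stop):
    """~1000 Hz servo loop: render forces against the latest published state."""
    while not stop.is_set():
        with shared["lock"]:
            state = shared["state"]
        # ... compute and output the device force from `state` here ...
        time.sleep(0.001)

def simulation_loop(shared, stop):
    """~60 Hz loop: advance the simulation and publish new state."""
    while not stop.is_set():
        with shared["lock"]:
            shared["state"] = step_physics(shared["state"])
        time.sleep(1 / 60)

shared = {"lock": threading.Lock(), "state": {}}
stop = threading.Event()
threading.Thread(target=haptic_loop, args=(shared, stop), daemon=True).start()
threading.Thread(target=simulation_loop, args=(shared, stop), daemon=True).start()
time.sleep(0.05)   # let the loops run briefly in this demo
stop.set()
```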
  • the behavior of an object is simulated by determining acceleration (a) by dividing the vector sum of forces determined to be acting on the object by the mass assigned to the object.
  • the current velocity (V) of the object can be determined by adding the previously determined velocity (Vo) to the acceleration (a) times the time step (t).
  • changing the effective time (t) on transition between modes can allow effective user interaction in some applications.
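The integration rule of the preceding bullets, a = F/m and V = Vo + a·t, with the mode-dependent time step made explicit. The per-mode scale factors below are assumptions; the text only says the effective time can change on a mode transition.

```python
import numpy as np

TIME_SCALE = {"holding": 1.0, "released": 0.5}  # assumed per-mode factors

def integrate(pos, vel, forces, mass, dt, mode):
    """One explicit-Euler step: a = F/m, V = Vo + a*t, x = xo + V*t.

    `forces` is a sequence of 3-vectors; `dt` is scaled per mode, so
    releasing an object can effectively slow down or speed up simulated
    time without changing any of the forces themselves.
    """
    t = dt * TIME_SCALE[mode]
    a = np.sum(forces, axis=0) / mass      # vector sum of forces over mass
    vel = vel + a * t
    pos = pos + vel * t
    return pos, vel
```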
  • the separation of the two modes can be especially beneficial in a game or other simulation that relies on physics-based simulations of objects within the game.
  • the physics simulation, sometimes accelerated with special-purpose hardware, is often designed for efficient implementation of the desired game characteristics. These are often not directly translatable to characteristics desired for efficient and intuitive haptic interactions.
  • the human-computer interface can optimize the haptic interaction; e.g., by presenting to the user's touch an object with mass, inertia, or other properties that give the user intuitive haptic control of the object. These properties can be effectively decoupled from the underlying physics simulation, so that the haptic interaction and physics simulation can be separately optimized.
  • the present invention can be especially useful in games where the user has experience or expectations in real world analogs to the game activity. For example, many users have experience throwing a football or shooting a basketball.
  • the present invention can allow the haptic control of the throwing or shooting motion to be designed such that the user experiences forces and motions building on the real world experience, while the underlying game simulation can proceed at its slower iteration rate and with its game-based physics properties.
  • Another example is a baseball game, where the speed and direction of motion of a haptic device, caused by the user in opposition to resistive forces, can be designed to provide a realistic throwing interaction.
  • the change from holding to released mode can be an occasion for acceleration, deceleration, error correction or amplification, or other changes suitable to the change in modes.

Abstract

A method of providing user interaction with a computer representation of a simulated object, where the user can control the object in three dimensions. The method can provide for two distinct states: a “holding” state, and a “released” state. The holding state roughly corresponds to the user holding the simulated object (although other metaphors such as holding a spring attached to the object, or controlling the object at a distance can also be suitable). The released state roughly corresponds to the user not holding the object. A simple example of the two states can include the holding, then throwing of a ball. While in the holding state, the method provides force feedback to the user representative of forces that the user might experience if the user were holding an actual object. The forces are not applied when in the released state.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. provisional application 60/681,007, “Computer Interface Methods and Apparatuses,” filed May 13, 2005, incorporated herein by reference.
  • BACKGROUND
  • The present invention relates to methods and apparatuses related to user interaction with computer-simulated objects, and more specifically to methods and apparatuses for force feedback in user interaction where the behavior of an object differs depending on whether or not a user is holding or controlling the object.
  • Computing technology has seen a many-fold increase in capability in recent years. Processors work at ever higher rates; memories are ever larger and always faster; mass storage is larger and cheaper every year. Computers now are essential elements in many aspects of life, and are often used to present three-dimensional worlds to users, in everything from games to scientific visualization.
  • The interface between the user and the computer has not seen the same rate of change. Screen windows, keyboard, monitor, and mouse are the standard, and have seen little change since their introduction. Many computers are purchased with great study as to processor speed, memory size, and disk space. Often, little thought is given to the human-computer interface, although most of the user's experience with the computer will be dominated by the interface (rarely does a user spend significant time waiting for a computer to calculate, while every interaction must use the human-computer interface).
  • As computers continue to increase in capability, the human-computer interface becomes increasingly important. The effective bandwidth of communication with the user will not be sufficient using only the traditional mouse and keyboard for input and monitor and speakers for output. More capable interface support will be desired to accommodate more complex and demanding applications. For example, six degree of freedom input devices, force and tactile feedback devices, three dimensional sound, and stereo or holographic displays can improve the human-computer interface.
  • As these new interface capabilities become available, new interface methods are needed to fully utilize the new modes of human-computer communication they enable. Specifically, new methods of interaction can use the additional human-computer communication paths to supplement or supplant conventional communication paths, freeing up traditional keyboard input and visual feedback bandwidth. The use of force feedback, or haptics, can be especially useful in allowing a user to feel parts of the interface, reducing the need for a user to visually manage interface characteristics that can be managed by feel. Users interfacing with non-computer tasks routinely exploit the combination of visual and haptic feedback (seeing one side of a task while feeling the other); bringing this sensory combination into human-computer interfaces can make such interfaces more efficient and more intuitive for the user. Accordingly, there is a need for new methods of human-computer interfacing that make appropriate use of haptic and visual feedback.
  • As a specific example, many contemporary computer games require the user to throw or otherwise propel an object. The games typically allow the user to specify a throw by pressing a button or combination of buttons. The timing of the button press, often relative to the timing of other actions occurring in the game, controls the effect of the throw (e.g., the accuracy or distance of the throw). Some games display a slider or power bar that indicates direction or force of a throw; the user must press the appropriate button when the slider or bar is at the right value for the desired throw. The user can thereby control aspects of the throw, but not with any of the skills learned in real world throwing. Specifically, the direction of the user's hand motion and the force applied by the user near the release of the throw generally are not significant to the throwing action in the game. Also, the object being thrown is generally represented within the game independent of whether it is being held or has been released, forcing the user to adjust the control of the object to the constraints of the simulations in the game. These limitations in current approaches can produce unrealistic and difficult-to-learn game interaction.
  • SUMMARY OF THE INVENTION
  • The present invention can provide a method of providing user interaction with a computer representation of a simulated object, where the user can control the object in three dimensions. The method can provide for two distinct states: a “holding” state, and a “released” state. The holding state roughly corresponds to the user holding the simulated object (although other metaphors such as holding a spring attached to the object, or controlling the object at a distance can also be suitable). The released state roughly corresponds to the user not holding the object. A simple example of the two states can include the holding, then throwing of a ball. While in the holding state, the method provides force feedback to the user representative of forces that the user might experience if the user were holding an actual object. The forces are not applied when in the released state.
  • The present invention can allow the user to direct transitions from the holding state to the released state (e.g., releasing the ball at the end of the throwing motion), from the released state to the holding state (e.g., picking up a ball), or both. The present invention can also provide forces that represent both pushing and pulling the simulated object. The present invention can also accommodate different haptic and visual expectations of the user by providing different interaction of the object within a simulated space in the two modes. For example, a ball can be simulated with a large mass when being held by the user, to provide significant force feedback communication to the user. Upon release, however, the ball's mass internal to the simulation can be adjusted to a smaller value, to allow the ball to move and interact with other objects on a scale more fitting to the visual display capabilities.
  • The present invention can also provide a method of representing a simulated object within a three dimensional computer simulation. The object can be represented within the simulation with a first set of properties when the user is directly controlling the object (e.g., holding the object), and with a second set of properties when the user is not directly controlling the object (e.g., after the user releases the object). The properties different between the two sets can comprise properties of the simulation (e.g., time or gravity forces), properties of the simulated object (e.g., mass or inertia), or a combination thereof. The simulation can comprise a simulation of real-world physics interactions, such as is supported by contemporary hardware accelerators, with the physics principles, the object properties, the environment properties, or a combination thereof, changed when the user initiates or terminates direct control of the object.
  • The present invention can be applied to computer game applications, where the present invention can provide for enhanced user experience in propelling an object. When the user is holding the object, the present invention can provide for a set of object and interaction forces that optimize the user experience of holding the object. When the user indicates a release of the object, the present invention can provide for a set of object and simulation properties that optimize the simulated object's behavior within the game environment. The present invention can be applied to games such as football, basketball, bowling, darts, and soccer.
  • The advantages and features of novelty that characterize the present invention are pointed out with particularity in the claims annexed hereto and forming a part hereof. However, for a better understanding of the invention and the methods of its making and using, reference should be made to the drawings which form a further part hereof, and to the accompanying descriptive matter in which there are illustrated and described preferred embodiments of the present invention. The description below involves specific examples; those skilled in the art will appreciate other examples from the teachings herein, and combinations of the teachings of the examples.
  • DESCRIPTION OF THE INVENTION
  • The present invention can provide a method of providing user interaction with a computer representation of a simulated object, where the user can control the object in three dimensions. The method can provide for two distinct states: a “holding” state, and a “released” state. The holding state roughly corresponds to the user holding the simulated object (although other metaphors such as holding a spring attached to the object, or controlling the object at a distance can also be suitable). The released state roughly corresponds to the user not holding the object. A simple example of the two states can include the holding, then throwing of a ball. While in the holding state, the method provides force feedback to the user representative of forces that the user might experience if the user were holding an actual object. The forces are not applied when in the released state.
  • The present invention can allow the user to direct transitions from the holding state to the released state (e.g., releasing the ball at the end of the throwing motion), from the released state to the holding state (e.g., picking up a ball), or both. The present invention can also provide forces that represent both pushing and pulling the simulated object. The present invention can also accommodate different haptic and visual expectations of the user by providing different interaction of the object within a simulated space in the two modes. For example, a ball can be simulated with a large mass when being held by the user, to provide significant force feedback communication to the user. Upon release, however, the ball's mass internal to the simulation can be adjusted to a smaller value, to allow the ball to move and interact with other objects on a scale more fitting to the visual display capabilities.
  • The present invention can also provide a method of representing a simulated object within a three dimensional computer simulation. The object can be represented within the simulation with a first set of properties when the user is directly controlling the object (e.g., holding the object), and with a second set of properties when the user is not directly controlling the object (e.g., after the user releases the object). The properties different between the two sets can comprise properties of the simulation (e.g., time or gravity forces), properties of the simulated object (e.g., mass or inertia), or a combination thereof. The simulation can comprise a simulation of real-world physics interactions, such as is supported by contemporary hardware accelerators, with the physics principles, the object properties, the environment properties, or a combination thereof, changed when the user initiates or terminates direct control of the object.
  • The present invention can be applied to computer game applications, where the present invention can provide for enhanced user experience in propelling an object. When the user is holding the object, the present invention can provide for a set of object and interaction forces that optimize the user experience of holding the object. When the user indicates a release of the object, the present invention can provide for a set of object and simulation properties that optimize the simulated object's behavior within the game environment. The present invention can be applied to games such as football, basketball, bowling, darts, and soccer.
  • Various terms may be referred to herein, and a discussion of their respective meanings is presented in order to facilitate understanding of the invention. Haptics is the field that studies the sense of touch. In computing, haptics refers to giving a User a sense of touch through a Haptic Device. A Haptic Device (or Device) is the mechanism that allows a User to feel virtual objects and sensations. The forces created from a Haptic Device can be controlled through motors or any other way of transferring sensations to a User. The position of a Device typically refers to the position of a handle on the Device that is held by the User. Any of the algorithms described can vary depending on where the handle of the Device is within its workspace. Haptic Devices can have any number of Degrees of Freedom (DOF), and can have a different number of DOF for tracking than for forces. For example, a Haptic Device can track 3 DOF (x, y, and z), and output forces in 3 DOF (x, y, and z), in which case the tracked DOF are the same as the forces DOF. As another example, a Haptic Device can track 6 DOF (x, y, and z, and rotation about x, y, and z), but only output forces in 3 DOF (x, y, and z), in which case the tracked DOF are a superset of the forces DOF. Additionally, any of a Device's DOF can be controlled by direct movements not relative to a fixed point in space (like a standard computer mouse), controlled by direct movements relative to a fixed point in space (like a mechanical tracker, mechanically grounded to a table it is resting on, where it can only move within a limited workspace), or controlled by forces against springs, movements around pivot points, twisting or turning a handle, or another type of limited movements (joystick, spaceball, etc).
  • A User is a person utilizing a Haptic Device to play a game or utilize some other type of application that gives a sense of touch. The User can experience a simulation or game in ways that are consistent with a Character (described below) such that the User feels, sees, and does what the Character does. The User can also have any portion or all of the interactions with the simulation be separate from the Character. For example, the User's view (i.e. what is seen on the monitor) does not have to be lined up with a Character's view (i.e. what a Character would see given the environment and the location of the Character's eyes), whether the Character is currently being controlled or not. The User can directly feel what a Character feels, through the Haptic Device (for example, the weight of picking up an object), or the User can feel a representation of what the Character feels to varying degrees (for example a haptic representation of the Character spinning in the air). A Character is a person or object controlled by a User in a video game or application. A Character can also be a first person view and representation of the User. Characters can be simple representations described only by graphics, or they can have complex characteristics such as Inverse Kinematic equations, body mass, muscles, energy, damage levels, artificial intelligence, or can represent any type of person, animal, or object in real life in varying degrees of realism. A Character can be a complex system like a human, or it can simply be a simple geometric object like a marble in a marble controlling game. Characters and their information can be described and contained on a single computer, on multiple computers, and in online environments. Characters can interact with other Characters. A Character can be controlled by the position of a Device or a Cursor, and a Character can be any Cursor or any object.
  • A Cursor is a virtual object controlled by a User controlling a Haptic Device. As the User moves the Haptic Device, the Cursor can move in some relationship to the movement of the Device. Typically, though not always, the Cursor moves proportionally to the movement of the Haptic Device along each axis (x,y,z). Those movements, however, can be scaled, rotated, skewed, or modified by any other function, and can be modified in these ways differently along different axes. For example, a Cursor can be controlled through a pivot point, where a movement of the Haptic Device to the right would move the Cursor to the left (the amount of movement can depend on a simulation of where the pivot point is located with respect to the Cursor). A Cursor can be a point, a sphere, any other geometric shape, a polygonal surface, a volumetric representation, a solids model, a spline based object, or can be described in any other mathematical way. A Cursor can also be a combination of any number of those objects. The graphical, haptic, and sound representations of a Cursor can be different from each other. For example, a Cursor can be a perfect haptic sphere, but a polygonal graphical sphere.
  • A Cursor can be controlled directly, or can be controlled through the interactions of one or more virtual objects that interact with another virtual object or other virtual objects. For example, a Haptic Device can control a point that is connected to a sphere with a spring, where the sphere is used as the Cursor. A Cursor's movements can be constrained in the visual, haptic, audio, or other sensory domain, preventing the Cursor, or its use, from moving into a specified area. Cursor movements can be constrained by objects, algorithms, or physical stops on a Device as examples. The position of a Cursor and the forces created can be modified with any type of linear or non-linear transformation (for example, scaling in the x direction). Position can be modified through transformations, and the forces created can be modified through an inverse function to modify the perception of Cursor movements, to modify objects (such as scale, rotation, etc), or to give interesting effects to the User. A Cursor can have visual, haptic, and sound representations, and any properties of any of those three Cursor modalities can be different. For example, a Cursor can have different sound, graphic, and haptic representations. A Cursor does not need to be shown visually. Also, a Cursor can vary in any of the ways described above differently at different times. For example, a Cursor can have a consistent haptic and graphic position when not touching objects, but a different haptic and graphic position when objects are touched. A Cursor can be shown graphically when preparing to perform an action (like beginning a snowboard run, beginning a golf swing, or selecting an object), and then not shown graphically when performing the action (snowboarding, swinging a golf club, or holding an object, respectively).
  • A Cursor can also be a representation of a User's interaction with a Character or an action, rather than a specific object used to touch other objects. For example, a Cursor can be the object used to implement a golf swing, and control the weight and feel of the club, even though the User never actually sees the Cursor directly. A Cursor can change haptic and graphic characteristics as a function of time, or as a function of another variable. A Cursor can be any object, any Character, or control any part of a Character or object. A Cursor can be in the shape of a human hand, foot, or any other part or whole of a human, animal, or cartoon. The shape, function, and motion of a Cursor can be related to that of an equivalent or similar object in the real world. For example, a Cursor shaped like a human hand can change wrist, hand, and finger positioning in order to gesture, grasp objects, or otherwise interact with objects similarly to how hands interact with real objects.
  • In video games and other computer applications that simulate throwing—such as propelling an object in an intended direction using a limb—there is a need to allow the computer User to actually feel a representation of the throwing process. This haptic experience of throwing can enable the User to more realistically control a virtual object or Character, can give a simulation a greater intensity or requirement of skill, and can make game-play more enjoyable. There are a number of applications, simulations, and games that simulate throwing, including overhand throwing (football, water polo, Quidditch, knife or axe throwing, javelin, darts, etc), sideways throwing (baseball, skipping stones, etc), underhand throwing (softball, egg toss, bowling, rugby, skeeball, horseshoes etc.), heaving (throwing a heavy boulder, log toss in celtic games, pushing another Character into the air, soccer throw-in, fish toss etc), overhand pushing (basketball, throwing something onto a roof, shotput, etc.), spinning and throwing (hammer throw, bolo, lasso, discus, etc), cross body throwing (frisbee, ring toss, etc), mechanical throwing (atlatl, lacrosse, fishing cast etc), dropping an object (tossing an object down to another person, jacks, etc), flipping an object off a finger (marbles, flipping a coin etc), and skilled tossing (juggling, baton twirling, etc). All of those applications are examples of what is meant herein by “to propel” an object.
  • Conventionally, the act of throwing is simulated by pressing buttons and/or moving joysticks on video game controllers. These types of interactions focus almost solely on the timing of game controller inputs and often provide the User only visual feedback. Although the timing of movements involves learning and can give a basic sense of the task, the lack of haptic feedback, and the limited control that buttons and joysticks offer, limit the realism and the more complex learning that 3D force feedback interactions can provide for this operation.
  • An interface in many games must provide the user with a method of indicating discharge of an object, for example release of a thrown ball. Conventional game interfaces use buttons or switches, which are unrelated to the usual methods of releasing objects and consequently do not make for a very realistic interface. In the real world, objects are thrown by imparting sufficient momentum to them. A haptic interface can accommodate interaction that allows intuitive release.
  • One or more force membranes can be presented to the user, where a force membrane is a region of the haptic space accessible by the user that imposes a force opposing motion toward the membrane as the user approaches it. For example, a membrane placed in the direction of the intended target can discourage the user from releasing the object accidentally, but can allow intentional release by application of sufficient force by the user to exceed the membrane's threshold. As another example, consider a user throwing a football. The user brings the haptic interaction device back (as if to cock the arm back to throw) past a membrane, then pushes it forward (feeling the weight and drag of the football haptically), and if the user pushes the football forward fast enough to give it the required momentum, the football is thrown. Motion of the object in a throwing direction can be accompanied by a combination of dynamics forces and viscosity to guide the user's movement. These forces can make directing the thrown object much easier. The forces related to the membrane can drop abruptly when the object is thrown, or can be decreased over time, depending on the desired interface characteristics. As examples, such a release mechanism can be used to throw balls or other objects (e.g., by pushing the object forward through a force barrier disposed between the user location and the target), to drop objects (e.g., by pushing the object downward through a force barrier between the user location and the floor), and to discharge weapons or blows (e.g., by pushing a representation of a weapon or character portion through a force barrier between the weapon or character portion and the target).
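  • A minimal sketch of the force-membrane release described above, assuming a one-dimensional membrane along the throw axis; the stiffness and the momentum threshold are invented for illustration:

        def membrane_force(cursor_pos, membrane_pos, stiffness=50.0):
            """Opposing force while the Cursor penetrates the membrane region."""
            penetration = cursor_pos - membrane_pos
            if penetration <= 0.0:
                return 0.0                      # not yet at the membrane
            return -stiffness * penetration     # spring-like force opposing motion

        def is_thrown(object_mass, forward_velocity, release_momentum=2.0):
            """Release once the user imparts sufficient momentum to the object."""
            return object_mass * forward_velocity >= release_momentum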
  • Other triggers can be used to effect release of the object. As an example, a membrane can apply an increasing force, and the object can be released when the user-applied force reaches a certain relation to the membrane's force (e.g., equals the maximum force, or is double the present force). Release can also be triggered by gesture recognition: a hand moving forward rapidly, then quickly slowing, can indicate a desired throw.
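  • The gesture-recognition trigger can be sketched as below; this is a hedged illustration in which all thresholds are invented, signaling a throw when the forward speed peaks and then drops sharply:

        def throw_gesture(speed_history, fast=1.5, slow=0.3):
            """speed_history: recent forward speeds in m/s, oldest first."""
            if len(speed_history) < 2:
                return False
            peaked = max(speed_history[:-1]) >= fast    # moved forward rapidly...
            slowed = speed_history[-1] <= slow          # ...then quickly slowed
            return peaked and slowed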
  • The direction of the object can be determined in various ways, some of which are discussed in more detail in U.S. provisional application 60/681,007, “Computer Interface Methods and Apparatuses,” filed May 13, 2005, incorporated herein by reference. As examples: the position, at release, pre-release, or both, can be used to set direction; the object can be modeled as attached by a spring to the cursor, and the direction of throw determined from the relative positions.
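  • For the spring-attachment model of direction, one possible sketch (an assumption for illustration, not the method of the referenced application) takes the throw direction as the normalized vector from the object to the cursor at release:

        import numpy as np

        def throw_direction(object_pos, cursor_pos):
            d = np.asarray(cursor_pos, dtype=float) - np.asarray(object_pos, dtype=float)
            n = np.linalg.norm(d)
            return d / n if n > 0.0 else np.zeros_like(d)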
  • A visual indication can be combined with the haptic information to indicate status of the release; for example, an image of an arm about to throw can be displayed in connection with the image of the ball when the ball is ready to be thrown (pulled through the first membrane in the previous example). A haptic cue can also be used, for example a vibration in the device or perceptible bump in its motion.
  • Bimodal interface. The present invention can provide for two separate modes of object simulation, optionally selectable responsive to direction from the user. In a first mode, the user is directly controlling a simulated object. This can correspond to, as examples, the user holding the simulated object, the user holding a spring or other intermediate structure that in turn holds the object, or the user controlling the object at a distance (e.g., according to a magical levitation spell or a space-type tractor beam). In a second mode, the user is not directly controlling the simulated object. This can correspond to, as examples, the user having released a thrown object, the user dropping an object, some interaction with the simulated environment causing the user to lose control of the object, or the distance control being disrupted (e.g., the spell is broken). The object can be represented to the user and to the simulation with different interaction properties in the two states, where “interaction properties” that can be different are aspects of the user's perception of the object that change upon transition between the modes, including as examples the properties of the object in the simulation; the properties of the object as perceived by the user (e.g., texture, size); the rate of determining object characteristics (e.g., position or deformation); simulation properties (e.g., time scales, physics models); environment properties (e.g., gravitational constants, relative size of objects); and execution of a corresponding software program on a different thread or a different processor. The transition between the states can be at any point where the user associates the state transition with the release of the object, including simultaneous with the release of an object by the user or before or after such release.
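  • The two-mode representation can be sketched as a pair of interaction-property sets selected by the holding/released state; the field names and values below are invented for illustration:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class InteractionProperties:
            mass: float          # kg, as represented in this mode
            time_step: float     # simulation time step (s) in this mode
            update_rate: float   # position/force update rate (Hz)

        HOLDING = InteractionProperties(mass=2.0, time_step=0.001, update_rate=1000.0)
        RELEASED = InteractionProperties(mass=0.6, time_step=1.0 / 60.0, update_rate=60.0)

        def interaction_properties(is_held):
            return HOLDING if is_held else RELEASED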
  • “Holding” mode. In the first mode, the user can generally be allowed to hold the object, manipulate the object, and interact with the object and, through the object, with the simulated environment. After an object is touched or selected, a User can determine that the object should be held. This can be accomplished automatically (for example, an object can be automatically grabbed when it is touched), or the object can be held or grabbed based on a User input such as a button press, a switch press, a voice input, a recognized gesture, or some other type of User input. When an object is held, it can have dynamics properties such as weight, inertia, momentum, or any other physical property. It can be implemented as a weight, and a User can feel the weight and reaction forces as it is moved. Objects that are held can have interactions with the environment. For example, a heavy object might need to be dragged across the virtual ground if it is too heavy to be picked up by the Character. In this case, as in real life, the User can feel the weight of the object, but that weight can be less than if the object were not touching the ground. As the object is dragged across the ground, forces can be applied to the Cursor or object representing the friction or resistance of the object as it moves across the ground, bumps into things, or snags or gets caught on other virtual objects.
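  • As a hedged sketch of the dragged-object feel described above (full weight when lifted, reduced weight plus a resistance term while the object is on the ground; the constants and names are invented):

        def holding_force(weight, on_ground, drag_speed, ground_support=0.8, resistance=0.5):
            if not on_ground:
                return weight                               # full weight when lifted
            felt_weight = weight * (1.0 - ground_support)   # the ground carries part of the load
            drag = resistance * weight * ground_support * drag_speed
            return felt_weight + drag                       # reduced weight plus dragging resistance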
  • Objects that are held can also have forces applied to them (or directly to the Cursor) based on other virtual objects that interact with the object that is held. For example, a User might feed a virtual animal. As an apple is picked up the User might feel the weight of the apple, then, when a virtual horse bites the apple, the User can feel the apple being pulled and pushed, and the weight can be adjusted to reflect the material removed by the bite. Objects can be rotated while they are held. To rotate an object, a User can rotate the handle of the Device, or can modify the position of the Device so that the rotation of the object occurs based on a change in position of the Cursor.
  • Objects that are held can have other haptic characteristics. A User could hold an object that is spinning and feel the resulting forces. For example, a virtual gyroscope could create directional forces that the User would feel. A User can feel the acceleration of an object being held. For example, if a User holds a virtual firehose, the User might feel the force pushing back from the direction the firehose is spraying, based on how much water is coming out, how close the virtual object being sprayed is, how tightly the hose is being held, how much pressure there is in the spraying, or other aspects of the simulation of the water being sprayed. If an object has its own forces, based on its representation, the User could feel them. For example, if a User is holding a popcorn popper, the User could feel the popcorn popping within it. Each individual force of a kernel popping could be relayed to the User, or the forces could be represented through some other representation (such as a random approximation of the forces that a popcorn popper would create). The forces that a User feels can have any form or representation, which can represent any action or event. For example, a User might hold a virtual wand to cast spells. The User can feel any type of force through the wand, which can represent any type of spell.
  • The User can feel interactions of a virtual held object with another object that hits it. For example, a User might feel the weight of a baseball bat, and a path constraint on how it can move. Then, when the bat hits a baseball, that impact can be felt by the User through a force applied to the bat or applied directly to the Cursor. If a User is holding onto a matador cape, and a virtual bull runs through the cape, the User would feel the pull of the cape against the Cursor as the bull pulls on the cape. The User might also feel the force adjusted if the cape were to rip, the feeling of the cape being pulled out of the Character's hands, the cape getting caught in the bull's horns and being pulled harder, or any other interaction with the object being held. A User might hold onto an object that is controlling the Character's movements. For example, the Character might grab onto the bumper of a moving car. Then, the User would feel the pull of the bumper relative to the movement of the Character. The feeling of pull could also be combined with other interactions such as maintaining balance. A User can feel an object interact with another object that is caught by the object. For example, the User can hold a net and catch a fish. The User can feel the forces of the fish flopping within the net either directly through forces applied to the Cursor, or through forces applied to the net through the fish's actions (and therefore applied to the Cursor, through its interaction with the net).
  • A User can control a held object to interact with an environment. For example, a User could hold onto a fishing pole and swing it forward to cast a line out into a lake. Then a User might feel a tugging force representing a fish nibbling at the hook. The User might pull on the fishing pole to hook the fish, and then feel the weight of the fish along the line added to the weight of the pole as the fish is reeled in. While an object is held, it can be shaken. A User can feel the changing inertia of an object that is moved back and forth at different rates. A virtual object might have liquid inside it that can be felt by a User through a force representation that varies based on how the liquid moves within the object. A User might feel an object slipping in his hands. For example, if a User is pulling on a virtual rope, the User might feel a force as the rope is pulled. Then the User could feel a lesser force if the rope slips through his hands, until the rope is grabbed tighter, at which point the rope stops sliding and the User feels a larger force again. The User might also feel force variations representing the texture of an object that is sliding through his hands. An object that is held can be swung as well. The User can feel the weight of an object, and feel the force of the object change based on the relative positions of the Cursor (to which the object is attached) and the object.
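  • The rope-slipping example can be sketched as a simple grip-limited force, an illustration with invented names: the User feels the full rope tension while the grip holds, and only a grip-limited force while the rope slides.

        def rope_force(rope_tension, grip_strength):
            if grip_strength >= rope_tension:
                return rope_tension     # rope held fast: full tension is felt
            return grip_strength        # rope slipping: force limited by the grip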
  • “Released” mode. In the second mode, the User is no longer directly controlling or manipulating the object. The object can then interact with the remainder of the simulated environment as any other simulated object. In many applications, the forces and motions that are most effective for efficient and intuitive User control of an object are not the same as the forces and motions that are most suitable for the computer simulation. As an example, the physical constraints of the range of motion of an input device can be larger than the actual display, but smaller than the total simulated space. Accordingly, motions of the User to control an object that are directly mapped to motion in the simulated space can be ill-suited for efficient User interaction. As another example, the forces applicable by the user, and the time over which such forces are applied, can be different in interacting with a haptic input device than the forces and times in the computer simulation. As a specific example, a user of a haptic input device might have only a few inches of motion to simulate shooting a basketball, while the simulation might represent a more complete range of travel for the shooting action. Also, the basketball experienced by the user might be more effective if the mass represented by the force feedback to the haptic device is larger than that in the simulation (moving a larger basketball mass a shorter distance, in this simplified example).
  • As another example, the time variable in a simulation can be different when the user is holding an object than when the object is released. Sophisticated user interaction experiences can be implemented by changing such parameters (e.g., mass, time, scale). The user can be provided with a sense of imparting energy to an object by applying a relatively large force; changing the time variable in the simulation when the user releases the object can interface the user's control of the object with the simulated environment in a manner that provides a unique user experience. Even if the properties of the simulated object are unchanged, the rate at which the object's position, velocity, or other characteristics are determined can be different in the two modes. For example, in the holding mode, the object's position and force interactions can be updated at a rate amenable to haptics interaction (e.g., 1000 Hz). In the released state, the object's position can be updated at a rate amenable to the simulation or to the desired display characteristics (e.g., 60 Hz).
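  • One way to realize the mode-dependent time variable is to scale the simulation clock by the current mode, a minimal sketch with invented scale factors:

        HOLDING_TIME_SCALE = 0.25    # invented: simulated time runs slower while holding
        RELEASED_TIME_SCALE = 1.0

        def effective_time_step(real_dt, is_held):
            scale = HOLDING_TIME_SCALE if is_held else RELEASED_TIME_SCALE
            return real_dt * scale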
  • In many computer simulations, the behavior of an object is simulated by determining acceleration (a) by dividing the vector sum of forces determined to be acting on the object by the mass assigned to the object. The current velocity (V) of the object can be determined by adding the previously determined velocity (Vo) to the acceleration (a) times the time step (t). The position (P) of the object can be determined from the previously determined position (Po) by P=Po+Vo*t+½*a*t*t. In such a simulation, changing the effective time (t) on transition between modes can allow effective user interaction in some applications.
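  • The update described above corresponds to the following step, shown here as a Python sketch for a single axis (the vector case is analogous); changing t between modes changes how far the object travels per update:

        def step(p0, v0, forces, mass, t):
            a = sum(forces) / mass                 # a = (vector sum of forces) / mass
            p = p0 + v0 * t + 0.5 * a * t * t      # P = Po + Vo*t + 1/2*a*t*t
            v = v0 + a * t                         # V = Vo + a*t
            return p, v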
  • The separation of the two modes can be especially beneficial in a game or other simulation that relies on physics-based simulations of objects within the game. The physics simulation, sometimes accelerated with special purpose hardware, is often designed for efficient implementation of the desired game characteristics. These are often not directly translatable to characteristics desired for efficient and intuitive haptic interactions. By separation into two modes of interaction, the human-computer interface can optimize the haptic interaction; e.g., by presenting to the user's touch an object with mass, inertia, or other properties that give the user intuitive haptic control of the object. These properties can be effectively decoupled from the underlying physics simulation, so that the haptic interaction and physics simulation can be separately optimized.
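  • The decoupling can be sketched as two independent loops that share only a release event; this is a hedged skeleton whose rates and structure are invented for illustration:

        import queue
        import threading
        import time

        release_events = queue.Queue()   # carries released objects from haptics to physics
        stop = threading.Event()

        def haptic_loop():
            while not stop.is_set():
                # read the device and render holding-mode forces here (~1000 Hz)
                time.sleep(0.001)

        def physics_loop():
            while not stop.is_set():
                while not release_events.empty():
                    release_events.get()         # hand a released object over to game physics
                # advance the game physics here (~60 Hz)
                time.sleep(1.0 / 60.0)

        threading.Thread(target=haptic_loop, daemon=True).start()
        threading.Thread(target=physics_loop, daemon=True).start()
        time.sleep(0.1)                          # let the loops run briefly for illustration
        stop.set()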
  • The present invention can be especially useful in games where the user has experience of, or expectations from, real-world analogs of the game activity. For example, many users have experience throwing a football or shooting a basketball. The present invention can allow the haptic control of the throwing or shooting motion to be designed such that the user experiences forces and motions building on that real-world experience, while the underlying game simulation can proceed at its slower iteration rate and with its game-based physics properties. Another example is a baseball game, where the speed and direction of motion of a haptic device, caused by the user in opposition to resistive forces, can be designed to provide a realistic throwing interaction. In all cases, the change from holding to released mode can be an occasion for acceleration, deceleration, error correction or amplification, or other changes suitable to the change in modes. Computer simulations of darts, bowling, and soccer throw-ins can also benefit from the present invention. Games with momentary contact between the user and an object can also benefit; for example, in a tennis game the time when the ball is in contact with the racket can be implemented as a holding mode, and in a soccer game the foot of a character can hold the ball for a brief time to benefit from the present invention.
  • The particular sizes and equipment discussed above are cited merely to illustrate particular embodiments of the invention. It is contemplated that the use of the invention may involve components having different sizes and characteristics. It is intended that the scope of the invention be defined by the claims appended hereto.

Claims (35)

1. A method of providing user interaction with a computer representation of a simulated object, wherein the user controls a user input device in three dimensions, comprising:
a. Providing a “holding” interaction state, wherein the user input device is placed in correspondence with the simulated object, and applying simulated object forces to the user input device, where simulated object forces comprise forces representative of forces that would be experienced by the user if the user were interacting with an actual object with characteristics similar to the simulated object;
b. Providing a “released” interaction state, in which state simulated object forces are either not applied to the user input device or are significantly reduced from the forces that would be applied in the holding state.
2. A method as in claim 1, further comprising switching from the holding interaction state to the released interaction state responsive to direction from the user.
3. A method as in claim 2, wherein the direction from the user comprises one or more of: the user activating a switch; the user issuing a voice direction; the user moving the simulated object within a simulated space to or past a defined boundary within the simulated space; the user touching a virtual object in the simulated space; the user imparting force, acceleration, or velocity to the simulated object of at least a threshold value.
4. A method as in claim 1, further comprising switching from the released interaction state to the holding interaction state responsive to direction from the user.
5. A method as in claim 4, wherein the direction from the user comprises one or more of: the user causing a cursor to move within a simulated space such that the cursor approaches within a threshold distance in the simulated space of the simulated object; the user activating a hardware switch; the user moving a cursor along a defined path in a simulated space; the user touching a virtual object in the simulated space; the user applying a predetermined force to an input device; the user moving a cursor in the simulated space along a path that substantially matches the path of the simulated object; the user moving the input device along a defined path; the user issuing a voice direction.
6. A method as in claim 1, wherein, in the holding interaction state, the simulated object forces comprise forces representative of forces that would be experienced by the user if the user were holding an actual object with characteristics similar to the simulated object, including both pushing and pulling the object.
7. A method as in claim 1, wherein the simulated object has a first set of properties while in the holding interaction state, and a second set of properties, different from the first set, while in the released interaction state.
8. A method as in claim 1, wherein the simulated object interacts with a simulated space in a first manner when in the holding interaction state, and in a second manner, different from the first manner, when in the released interaction state.
9. A method as in claim 1, wherein in the holding interaction state, the simulated object forces communicate to the user inertia and momentum of the simulated object.
10. A method as in claim 1, further comprising presenting to the user a visible representation of the simulated object, and wherein the visible representation of the simulated object, a displayed space including the simulated object, or both, is different in the holding interaction state than in the released interaction state.
11. A method of representing a simulated object in a three dimensional computer simulation, comprising:
a. Determining whether the simulated object is being directly controlled by the user;
b. If the simulated object is being directly controlled by the user, then representing the object within the simulation according to a first set of interaction properties;
c. If the simulated object is not being directly controlled by the user, then representing the object within the simulation according to a second set of interaction properties, different from the first set.
12. A method as in claim 11, wherein the first set of interaction properties comprises a first set of object properties, and wherein the second set of interaction properties comprises a second set of object properties, different from the first set of object properties.
13. A method as in claim 12, wherein the object is represented by at least one of mass, size, gravitational constant, time step, or acceleration, which has a different value in the first set of properties than in the second set of properties.
14. A method as in claim 12, further comprising determining forces to be applied to a user controlling the simulated object, which forces are determined from the first set of properties.
15. A method as in claim 11, wherein the position of the simulated object is represented substantially by P=Po+Vo*t+½*a*t*t; and wherein t is different in the first and second interaction properties.
16. A method of simulating an object in a computer-simulated environment, wherein the object has a set of object properties and the environment has a set of environment properties, and wherein the simulation comprises determining interactions according to real-world physics principles applied to the object and environment properties, and wherein a user can control the object within the environment, wherein at least one of the physics principles, the object properties, and the environment properties is different when the user is controlling the object than when the user is not controlling the object.
17. A method of allowing a user to propel an object in a computer game using a force-capable input device, comprising:
a. Moving the object within the game responsive to user motion of or force applied to the input device;
b. Communicating forces to the user via the input device, which forces are representative of the object's simulated properties, which properties include at least one of mass, acceleration, gravitational force, or wind resistance;
c. Accepting a “release” indication from the user, and then
d. Moving the object within the game according to the object's simulated energy, position, velocity, acceleration, or a combination thereof, near the time of the release indication, and according to interaction properties, at least one of which is different after the release indication than before the release indication.
18. A method as in claim 17, wherein the object comprises a football in a computer football game.
19. A method as in claim 17, wherein the object comprises a basketball in a computer basketball game.
20. A method as in claim 17, wherein the object comprises a bowling ball in a computer bowling game.
21. A method as in claim 17, wherein the object comprises a dart in a computer dart game.
22. A method as in claim 17, wherein the object comprises a baseball in a computer baseball game.
23. A method as in claim 17, wherein the object comprises a soccer ball in a computer soccer game.
24. A method as in claim 17, wherein the release indication comprises motion of the object to a boundary within the game.
25. A method as in claim 17, further comprising displaying to the user a visible representation of the object and the game, and wherein the visible representation is different after the release indication than before the release indication.
26. A method as in claim 17, wherein at least one of the different property or properties is different in a manner that encourages the propelled object more toward a desired result than would have been the case had the property not been different.
27. A method as in claim 1, wherein the position, velocity, attendant forces, or a combination thereof are determined at a first rate when in the holding interaction state, and at a second rate, slower than the first rate, when in the released interaction state.
28. A method as in claim 27, wherein the direction of motion of the simulated object is changed on transition from the holding state to the released state.
29. A method as in claim 1, wherein at least one of the object's properties in the simulation, the simulation properties, or the simulation time is different in the released interaction state than in the holding interaction state.
30. A method as in claim 27, wherein the direction of the object's velocity is different in a manner that encourages the object toward a goal direction or destination.
31. A method as in claim 1, wherein at least one object property, simulation property, or environment property is different in the holding state than in the released state.
32. A method as in claim 1, wherein the simulated object forces are determined from a first set of object, simulation, and environment properties; and the simulated object's motion within a simulated environment is determined from a second set of object, simulation, and environment properties; wherein at least one of the object, simulation, and environment properties in the second set is different than in the first set.
33. A method as in claim 15, wherein at least one of Po, Vo, and a is different in the second interaction properties than in the first interaction properties.
34. A method as in claim 4, wherein at least one of the direction of motion, the velocity, or the acceleration of the simulated object is changed on transition from the holding state to the released state.
35. A method as in claim 2, wherein the holding state is entered responsive to a button pressed by the user, and the released state is entered responsive to the user releasing the button.
US11/433,173 2005-05-13 2006-05-13 Bimodal user interaction with a simulated object Abandoned US20060277466A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/433,173 US20060277466A1 (en) 2005-05-13 2006-05-13 Bimodal user interaction with a simulated object
PCT/US2006/042557 WO2007133251A2 (en) 2006-05-13 2006-10-30 Bimodal user interaction with a simulated object
US12/783,386 US9804672B2 (en) 2005-05-13 2010-05-19 Human-computer user interaction

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US68100705P 2005-05-13 2005-05-13
US11/433,173 US20060277466A1 (en) 2005-05-13 2006-05-13 Bimodal user interaction with a simulated object

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/783,386 Continuation US9804672B2 (en) 2005-05-13 2010-05-19 Human-computer user interaction

Publications (1)

Publication Number Publication Date
US20060277466A1 true US20060277466A1 (en) 2006-12-07

Family

ID=38694353

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/433,173 Abandoned US20060277466A1 (en) 2005-05-13 2006-05-13 Bimodal user interaction with a simulated object
US12/783,386 Active 2028-01-16 US9804672B2 (en) 2005-05-13 2010-05-19 Human-computer user interaction

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/783,386 Active 2028-01-16 US9804672B2 (en) 2005-05-13 2010-05-19 Human-computer user interaction

Country Status (2)

Country Link
US (2) US20060277466A1 (en)
WO (1) WO2007133251A2 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090315839A1 (en) * 2008-06-24 2009-12-24 Microsoft Corporation Physics simulation-based interaction for surface computing
US20100148999A1 (en) * 2008-12-16 2010-06-17 Casparian Mark A Keyboard with user configurable granularity scales for pressure sensitive keys
US20110095877A1 (en) * 2008-12-16 2011-04-28 Casparian Mark A Apparatus and methods for mounting haptics actuation circuitry in keyboards
US20110102326A1 (en) * 2008-12-16 2011-05-05 Casparian Mark A Systems and methods for implementing haptics for pressure sensitive keyboards
US8005656B1 (en) * 2008-02-06 2011-08-23 Ankory Ran Apparatus and method for evaluation of design
US8700829B2 (en) 2011-09-14 2014-04-15 Dell Products, Lp Systems and methods for implementing a multi-function mode for pressure sensitive sensors and keyboards
US20140104320A1 (en) * 2012-10-17 2014-04-17 Perceptive Pixel, Inc. Controlling Virtual Objects
US8711011B2 (en) 2008-12-16 2014-04-29 Dell Products, Lp Systems and methods for implementing pressure sensitive keyboards
US20150169174A1 (en) * 2013-12-18 2015-06-18 Dassault Systemes DELMIA Corp. Posture Creation With Tool Pickup
US9111005B1 (en) 2014-03-13 2015-08-18 Dell Products Lp Systems and methods for configuring and controlling variable pressure and variable displacement sensor operations for information handling systems
US9343248B2 (en) 2013-08-29 2016-05-17 Dell Products Lp Systems and methods for implementing spring loaded mechanical key switches with variable displacement sensing
US9368300B2 (en) 2013-08-29 2016-06-14 Dell Products Lp Systems and methods for lighting spring loaded mechanical key switches
US10025099B2 (en) 2015-06-10 2018-07-17 Microsoft Technology Licensing, Llc Adjusted location hologram display
US20180229078A1 (en) * 2011-08-29 2018-08-16 Icuemotion Llc Inertial sensor motion tracking and stroke analysis system
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
CN108885523A (en) * 2016-03-31 2018-11-23 索尼公司 Information processing equipment, display control method and program
US10668353B2 (en) 2014-08-11 2020-06-02 Icuemotion Llc Codification and cueing system for sport and vocational activities
US10854104B2 (en) 2015-08-28 2020-12-01 Icuemotion Llc System for movement skill analysis and skill augmentation and cueing
CN113420031A (en) * 2021-06-30 2021-09-21 中国航空油料有限责任公司 Aviation fuel data issuing system and aviation fuel data issuing method
WO2022048403A1 (en) * 2020-09-01 2022-03-10 魔珐(上海)信息科技有限公司 Virtual role-based multimodal interaction method, apparatus and system, storage medium, and terminal
CN114433506A (en) * 2022-04-11 2022-05-06 北京霍里思特科技有限公司 Sorting machine
WO2023092229A1 (en) * 2021-11-26 2023-06-01 Lululemon Athletica Canada Inc. Method and system to provide multisensory digital interaction experiences
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball

Families Citing this family (174)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7834855B2 (en) 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
KR100556503B1 (en) * 2002-11-26 2006-03-03 엘지전자 주식회사 Control Method of Drying Time for Dryer
US10697996B2 (en) * 2006-09-26 2020-06-30 Nintendo Co., Ltd. Accelerometer sensing and object control
US20080125224A1 (en) * 2006-09-26 2008-05-29 Pollatsek David Method and apparatus for controlling simulated in flight realistic and non realistic object effects by sensing rotation of a hand-held controller
US20080207330A1 (en) * 2007-02-26 2008-08-28 Theodore Beale Artificial player character for massive multi-player on-line game
US8326442B2 (en) * 2007-05-25 2012-12-04 International Business Machines Corporation Constrained navigation in a three-dimensional (3D) virtual arena
JP5014898B2 (en) * 2007-06-29 2012-08-29 Thk株式会社 Steering for drive simulator and drive simulator
WO2009067654A1 (en) * 2007-11-21 2009-05-28 Edda Technology, Inc. Method and system for interactive percutaneous pre-operation surgical planning
US11264139B2 (en) 2007-11-21 2022-03-01 Edda Technology, Inc. Method and system for adjusting interactive 3D treatment zone for percutaneous treatment
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
EP2244797A4 (en) * 2008-01-17 2011-06-15 Vivox Inc Scalable techniques for providing real-lime per-avatar streaming data in virtual reality systems thai employ per-avatar rendered environments
JP2009194597A (en) * 2008-02-14 2009-08-27 Sony Corp Transmission and reception system, transmitter, transmission method, receiver, reception method, exhibition device, exhibition method, program, and recording medium
US8350843B2 (en) * 2008-03-13 2013-01-08 International Business Machines Corporation Virtual hand: a new 3-D haptic interface and system for virtual environments
US8203529B2 (en) * 2008-03-13 2012-06-19 International Business Machines Corporation Tactile input/output device and system to represent and manipulate computer-generated surfaces
JP2009271909A (en) * 2008-04-08 2009-11-19 Canon Inc Graphics rendering editing system, graphics rendering editing device, and editing method
US8749495B2 (en) * 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device
US8957835B2 (en) 2008-09-30 2015-02-17 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US8310447B2 (en) * 2008-11-24 2012-11-13 Lsi Corporation Pointing device housed in a writing device
US8294047B2 (en) * 2008-12-08 2012-10-23 Apple Inc. Selective input signal rejection and modification
KR101531363B1 (en) * 2008-12-10 2015-07-06 삼성전자주식회사 Method of controlling virtual object or view point on two dimensional interactive display
US9489131B2 (en) * 2009-02-05 2016-11-08 Apple Inc. Method of presenting a web page for accessibility browsing
US8364314B2 (en) * 2009-04-30 2013-01-29 GM Global Technology Operations LLC Method and apparatus for automatic control of a humanoid robot
US20100306825A1 (en) 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US9710097B2 (en) * 2009-07-10 2017-07-18 Adobe Systems Incorporated Methods and apparatus for natural media painting using touch-and-stylus combination gestures
US9696842B2 (en) * 2009-10-06 2017-07-04 Cherif Algreatly Three-dimensional cube touchscreen with database
US20110096036A1 (en) * 2009-10-23 2011-04-28 Mcintosh Jason Method and device for an acoustic sensor switch
JP5269745B2 (en) * 2009-10-30 2013-08-21 任天堂株式会社 Object control program, object control apparatus, object control system, and object control method
US9678508B2 (en) * 2009-11-16 2017-06-13 Flanders Electric Motor Service, Inc. Systems and methods for controlling positions and orientations of autonomous vehicles
WO2011070464A2 (en) * 2009-12-10 2011-06-16 Koninklijke Philips Electronics N.V. A system for rapid and accurate quantitative assessment of traumatic brain injury
EP4318463A3 (en) * 2009-12-23 2024-02-28 Google LLC Multi-modal input on an electronic device
US11416214B2 (en) * 2009-12-23 2022-08-16 Google Llc Multi-modal input on an electronic device
WO2011100220A1 (en) 2010-02-09 2011-08-18 The Trustees Of The University Of Pennsylvania Systems and methods for providing vibration feedback in robotic systems
US8683367B1 (en) * 2010-02-24 2014-03-25 The Boeing Company Method and system for differentiating among a plurality of design expressions
JP5087101B2 (en) * 2010-03-31 2012-11-28 株式会社バンダイナムコゲームス Program, information storage medium, and image generation system
KR101640043B1 (en) * 2010-04-14 2016-07-15 삼성전자주식회사 Method and Apparatus for Processing Virtual World
US9557814B2 (en) * 2010-04-22 2017-01-31 Sony Interactive Entertainment Inc. Biometric interface for a handheld device
US10217264B2 (en) * 2010-06-01 2019-02-26 Vladimir Vaganov 3D digital painting
US10922870B2 (en) * 2010-06-01 2021-02-16 Vladimir Vaganov 3D digital painting
US8966400B2 (en) * 2010-06-07 2015-02-24 Empire Technology Development Llc User movement interpretation in computer generated reality
US20120007808A1 (en) 2010-07-08 2012-01-12 Disney Enterprises, Inc. Interactive game pieces using touch screen devices for toy play
US9274641B2 (en) * 2010-07-08 2016-03-01 Disney Enterprises, Inc. Game pieces for use with touch screen devices and related methods
US8988445B2 (en) * 2010-07-30 2015-03-24 The Trustees Of The University Of Pennsylvania Systems and methods for capturing and recreating the feel of surfaces
US8282454B2 (en) 2010-09-29 2012-10-09 Nintendo Co., Ltd. Video game systems and methods including moving the protected character based with the movement of unprotected game character(s)
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
US9218316B2 (en) 2011-01-05 2015-12-22 Sphero, Inc. Remotely controlling a self-propelled device in a virtualized environment
US9429940B2 (en) 2011-01-05 2016-08-30 Sphero, Inc. Self propelled device with magnetic coupling
US8751063B2 (en) 2011-01-05 2014-06-10 Orbotix, Inc. Orienting a user interface of a controller for operating a self-propelled device
US9090214B2 (en) 2011-01-05 2015-07-28 Orbotix, Inc. Magnetically coupled accessory for a self-propelled device
US10281915B2 (en) 2011-01-05 2019-05-07 Sphero, Inc. Multi-purposed self-propelled device
WO2012103241A1 (en) * 2011-01-28 2012-08-02 Yair Greenberg Guided contact and movement response generating article and method
US9990856B2 (en) 2011-02-08 2018-06-05 The Trustees Of The University Of Pennsylvania Systems and methods for providing vibration feedback in robotic systems
US9013264B2 (en) 2011-03-12 2015-04-21 Perceptive Devices, Llc Multipurpose controller for electronic devices, facial expressions management and drowsiness detection
US20120244969A1 (en) 2011-03-25 2012-09-27 May Patents Ltd. System and Method for a Motion Sensing Device
TWI515603B (en) * 2011-03-28 2016-01-01 緯創資通股份有限公司 Device and method of touch control feedback and touch control display device using the same
JP5498437B2 (en) * 2011-05-25 2014-05-21 株式会社ソニー・コンピュータエンタテインメント Information processing device, information processing method, information processing program, computer-readable recording medium storing information processing program, thickness region setting device, thickness region setting method, thickness region setting program, computer readable storing thickness region setting program Data structure related to various recording media and surfaces in virtual space
FR2976700B1 (en) * 2011-06-17 2013-07-12 Inst Nat Rech Inf Automat METHOD FOR GENERATING COMMAND COORDINATION CONTROL ORDERS FOR DISPLACING AN ANIMATED PLATFORM AND CORRESPONDING GENERATOR.
US9498720B2 (en) * 2011-09-30 2016-11-22 Microsoft Technology Licensing, Llc Sharing games using personal audio/visual apparatus
US9248372B2 (en) * 2011-10-05 2016-02-02 Wargaming.Net Llp Using and exporting experience gained in a video game
US9289685B2 (en) * 2011-11-18 2016-03-22 Verizon Patent And Licensing Inc. Method and system for providing virtual throwing of objects
KR102054370B1 (en) * 2011-11-23 2019-12-12 삼성전자주식회사 Haptic feedback method and apparatus, machine-readable storage medium and portable communication terminal
US20130198625A1 (en) * 2012-01-26 2013-08-01 Thomas G Anderson System For Generating Haptic Feedback and Receiving User Inputs
EP2812830B1 (en) * 2012-02-06 2018-01-10 Hothead Games Inc. Virtual opening of boxes and packs of cards
EP2812088A4 (en) 2012-02-06 2015-05-20 Hothead Games Inc Virtual competitive group management systems and methods
CN102662567B (en) * 2012-03-23 2016-01-13 腾讯科技(深圳)有限公司 The method and apparatus of operation in trigger web pages
US8698746B1 (en) * 2012-04-24 2014-04-15 Google Inc. Automatic calibration curves for a pointing device
US9183676B2 (en) * 2012-04-27 2015-11-10 Microsoft Technology Licensing, Llc Displaying a collision between real and virtual objects
US20130293580A1 (en) 2012-05-01 2013-11-07 Zambala Lllp System and method for selecting targets in an augmented reality environment
US20150087416A1 (en) * 2012-05-08 2015-03-26 Capcom Co., Ltd. Game program, game apparatus and game system
US9292758B2 (en) 2012-05-14 2016-03-22 Sphero, Inc. Augmentation of elements in data content
WO2013173389A1 (en) 2012-05-14 2013-11-21 Orbotix, Inc. Operating a computing device by detecting rounded objects in an image
US9827487B2 (en) 2012-05-14 2017-11-28 Sphero, Inc. Interactive augmented reality using a self-propelled device
JP6392747B2 (en) * 2012-05-31 2018-09-19 ノキア テクノロジーズ オサケユイチア Display device
US20140002336A1 (en) * 2012-06-27 2014-01-02 Greg D. Kaine Peripheral device for visual and/or tactile feedback
KR101398086B1 (en) * 2012-07-06 2014-05-30 (주)위메이드엔터테인먼트 Method for processing user gesture input in online game
EP3007039B1 (en) * 2012-07-13 2018-12-05 Sony Depthsensing Solutions SA/NV Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
US10056791B2 (en) 2012-07-13 2018-08-21 Sphero, Inc. Self-optimizing power transfer
US8902159B1 (en) 2012-07-24 2014-12-02 John Matthews Ergonomic support apparatus having situational sensory augmentation
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
AU2013296278B2 (en) * 2012-08-03 2018-06-14 Stryker Corporation Systems and methods for robotic surgery
US8831840B2 (en) 2012-08-31 2014-09-09 United Video Properties, Inc. Methods and systems for producing the environmental conditions of a media asset in a vehicle
US9621602B2 (en) * 2012-11-27 2017-04-11 Facebook, Inc. Identifying and providing physical social actions to a social networking system
US10346025B2 (en) * 2013-02-05 2019-07-09 Microsoft Technology Licensing, Llc Friction field for fluid margin panning in a webpage
US20180046265A1 (en) * 2013-06-06 2018-02-15 Idhl Holdings, Inc. Latency Masking Systems and Methods
US20160310753A1 (en) * 2013-06-21 2016-10-27 Brian Bravo Body tuner system
US9101838B2 (en) 2013-07-26 2015-08-11 David J. Dascher Dual pivot game controller
US20150041554A1 (en) * 2013-08-06 2015-02-12 Watergush, Inc. Social media fountain
GB201315228D0 (en) * 2013-08-27 2013-10-09 Univ London Queen Mary Control methods for expressive musical performance from a keyboard or key-board-like interface
US9411796B2 (en) * 2013-09-04 2016-08-09 Adobe Systems Incorporated Smoothing paths in a graphical interface generated by drawing inputs
CN103455623B (en) * 2013-09-12 2017-02-15 广东电子工业研究院有限公司 Clustering mechanism capable of fusing multilingual literature
US20150077345A1 (en) * 2013-09-16 2015-03-19 Microsoft Corporation Simultaneous Hover and Touch Interface
US9208765B1 (en) 2013-09-18 2015-12-08 American Megatrends, Inc. Audio visual presentation with three-dimensional display devices
US9411511B1 (en) * 2013-09-19 2016-08-09 American Megatrends, Inc. Three-dimensional display devices with out-of-screen virtual keyboards
US9335831B2 (en) * 2013-10-14 2016-05-10 Adaptable Keys A/S Computer keyboard including a control unit and a keyboard screen
US8825492B1 (en) * 2013-10-28 2014-09-02 Yousef A. E. S. M. Buhadi Language-based video game
US10168873B1 (en) 2013-10-29 2019-01-01 Leap Motion, Inc. Virtual interactions for machine control
JP5888309B2 (en) * 2013-10-31 2016-03-22 カシオ計算機株式会社 Training support apparatus and system, form analysis apparatus and method, and program
US9996797B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
US10416834B1 (en) 2013-11-15 2019-09-17 Leap Motion, Inc. Interaction strength using virtual objects for machine control
US9829882B2 (en) 2013-12-20 2017-11-28 Sphero, Inc. Self-propelled device with center of mass drive system
EP2891950B1 (en) 2014-01-07 2018-08-15 Sony Depthsensing Solutions Human-to-computer natural three-dimensional hand gesture based navigation method
US20160348341A1 (en) * 2014-01-27 2016-12-01 Volvo Construction Equipment Ab Outrigger and dozer control using gui
JP2015166890A (en) * 2014-03-03 2015-09-24 ソニー株式会社 Information processing apparatus, information processing system, information processing method, and program
CN104134226B (en) * 2014-03-12 2015-08-19 腾讯科技(深圳)有限公司 Speech simulation method, device and client device in a kind of virtual scene
GB201409764D0 (en) 2014-06-02 2014-07-16 Accesso Technology Group Plc Queuing system
US11900734B2 (en) 2014-06-02 2024-02-13 Accesso Technology Group Plc Queuing system
US9919208B2 (en) * 2014-12-11 2018-03-20 Immersion Corporation Video gameplay haptics
US10768704B2 (en) 2015-03-17 2020-09-08 Whirlwind VR, Inc. System and method for modulating a peripheral device based on an unscripted feed using computer vision
US9707680B1 (en) 2015-05-28 2017-07-18 X Development Llc Suggesting, selecting, and applying task-level movement parameters to implementation of robot motion primitives
US10101157B2 (en) 2015-09-14 2018-10-16 Eric Bharucha Free-space force feedback system
US11351472B2 (en) 2016-01-19 2022-06-07 Disney Enterprises, Inc. Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon
US9846971B2 (en) * 2016-01-19 2017-12-19 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon
US9971408B2 (en) * 2016-01-27 2018-05-15 Ebay Inc. Simulating touch in a virtual environment
US11663783B2 (en) 2016-02-10 2023-05-30 Disney Enterprises, Inc. Systems and methods for using augmented reality with the internet of things
AU2017227708A1 (en) 2016-03-01 2018-10-18 ARIS MD, Inc. Systems and methods for rendering immersive environments
US10587834B2 (en) 2016-03-07 2020-03-10 Disney Enterprises, Inc. Systems and methods for tracking objects for augmented reality
JP2017170586A (en) * 2016-03-25 2017-09-28 セイコーエプソン株式会社 End effector, robot and robot control device
JP6382884B2 (en) * 2016-04-26 2018-08-29 日本電信電話株式会社 Estimation apparatus, estimation method, and program
US9919213B2 (en) 2016-05-03 2018-03-20 Hothead Games Inc. Zoom controls for virtual environment user interfaces
US10004991B2 (en) 2016-06-28 2018-06-26 Hothead Games Inc. Systems and methods for customized camera views in virtualized environments
US10010791B2 (en) 2016-06-28 2018-07-03 Hothead Games Inc. Systems and methods for customized camera views and customizable objects in virtualized environments
US10905956B2 (en) 2016-06-28 2021-02-02 Rec Room Inc. Systems and methods providing temporary decoupling of user avatar synchronicity for presence enhancing experiences
US20180046352A1 (en) * 2016-08-09 2018-02-15 Matthew Johnson Virtual cursor movement
USD852210S1 (en) * 2016-08-24 2019-06-25 Beijing Kingsoft Internet Security Software Co., Ltd. Mobile communication terminal with graphical user interface
USD852209S1 (en) * 2016-08-24 2019-06-25 Beijing Kingsoft Internet Security Software Co., Ltd. Mobile communication terminal with animated graphical user interface
CN106446452B (en) * 2016-10-19 2019-03-29 南京信息工程大学 The modeling method of arched girder spring deformation model based on power haptic interaction
US10493363B2 (en) * 2016-11-09 2019-12-03 Activision Publishing, Inc. Reality-based video game elements
US11205103B2 (en) 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis
WO2018112112A1 (en) * 2016-12-13 2018-06-21 DeepMotion, Inc. Improved virtual reality system using multiple force arrays for a solver
US10732714B2 (en) 2017-05-08 2020-08-04 Cirrus Logic, Inc. Integrated haptic system
CN110869879A (en) * 2017-06-30 2020-03-06 雷蛇(亚太)私人有限公司 Adjustable haptic feedback using force sensors and haptic actuators
CN107330858B (en) * 2017-06-30 2020-12-04 北京乐蜜科技有限责任公司 Picture processing method and device, electronic equipment and storage medium
US10375930B1 (en) * 2017-07-07 2019-08-13 Chad R. James Animal training device that controls stimulus using proportional pressure-based input
US11259121B2 (en) 2017-07-21 2022-02-22 Cirrus Logic, Inc. Surface speaker
US10481680B2 (en) 2018-02-02 2019-11-19 Disney Enterprises, Inc. Systems and methods to provide a shared augmented reality experience
US20190294249A1 (en) * 2018-03-21 2019-09-26 JANUS Research Group, Inc. Systems and methods for haptic feedback in a virtual reality system
US10546431B2 (en) 2018-03-29 2020-01-28 Disney Enterprises, Inc. Systems and methods to augment an appearance of physical object for an augmented reality experience
US10832537B2 (en) 2018-04-04 2020-11-10 Cirrus Logic, Inc. Methods and apparatus for outputting a haptic signal to a haptic transducer
US11023669B2 (en) 2018-06-29 2021-06-01 Microsoft Technology Licensing, Llc Rendering lambda functions in spreadsheet applications
US10699068B2 (en) 2018-06-29 2020-06-30 Microsoft Technology Licensing, Llc Distribution of lambda functions
US10726201B1 (en) 2018-06-29 2020-07-28 Microsoft Technology Licensing, Llc Creating and handling lambda functions in spreadsheet applications
US11423116B2 (en) * 2018-06-29 2022-08-23 Microsoft Technology Licensing, Llc Automatically creating lambda functions in spreadsheet applications
WO2020026301A1 (en) * 2018-07-30 2020-02-06 株式会社ソニー・インタラクティブエンタテインメント Game device, and golf game control method
US11269415B2 (en) 2018-08-14 2022-03-08 Cirrus Logic, Inc. Haptic output systems
US11312015B2 (en) * 2018-09-10 2022-04-26 Reliabotics LLC System and method for controlling the contact pressure applied by an articulated robotic arm to a working surface
US10974132B2 (en) 2018-10-02 2021-04-13 Disney Enterprises, Inc. Systems and methods to provide a shared interactive experience across multiple presentation devices based on detection of one or more extraterrestrial bodies
US11307730B2 (en) * 2018-10-19 2022-04-19 Wen-Chieh Geoffrey Lee Pervasive 3D graphical user interface configured for machine learning
GB201817495D0 (en) 2018-10-26 2018-12-12 Cirrus Logic Int Semiconductor Ltd A force sensing system and method
US11014008B2 (en) 2019-03-27 2021-05-25 Disney Enterprises, Inc. Systems and methods for game profile development based on virtual and/or real activities
CN109955225A (en) * 2019-03-28 2019-07-02 东南大学 A kind of force feedback hand controlled device of parallel Three Degree Of Freedom and its control method
US10726683B1 (en) 2019-03-29 2020-07-28 Cirrus Logic, Inc. Identifying mechanical impedance of an electromagnetic load using a two-tone stimulus
US11509292B2 (en) 2019-03-29 2022-11-22 Cirrus Logic, Inc. Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter
US10955955B2 (en) 2019-03-29 2021-03-23 Cirrus Logic, Inc. Controller for use in a device comprising force sensors
US11283337B2 (en) 2019-03-29 2022-03-22 Cirrus Logic, Inc. Methods and systems for improving transducer dynamics
US10992297B2 (en) 2019-03-29 2021-04-27 Cirrus Logic, Inc. Device comprising force sensors
US10828672B2 (en) 2019-03-29 2020-11-10 Cirrus Logic, Inc. Driver circuitry
US11644370B2 (en) 2019-03-29 2023-05-09 Cirrus Logic, Inc. Force sensing with an electromagnetic load
US11004247B2 (en) 2019-04-02 2021-05-11 Adobe Inc. Path-constrained drawing with visual properties based on drawing tool
DK180359B1 (en) 2019-04-15 2021-02-03 Apple Inc Accelerated scrolling and selection
US10916061B2 (en) 2019-04-24 2021-02-09 Disney Enterprises, Inc. Systems and methods to synchronize real-world motion of physical objects with presentation of virtual content
CA3044587C (en) * 2019-05-28 2023-04-25 Square Enix Ltd. Control of player character with enhanced motion functionality
US10976825B2 (en) 2019-06-07 2021-04-13 Cirrus Logic, Inc. Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system
CN114008569A (en) 2019-06-21 2022-02-01 思睿逻辑国际半导体有限公司 Method and apparatus for configuring a plurality of virtual buttons on a device
US11216150B2 (en) 2019-06-28 2022-01-04 Wen-Chieh Geoffrey Lee Pervasive 3D graphical user interface with vector field functionality
US10918949B2 (en) 2019-07-01 2021-02-16 Disney Enterprises, Inc. Systems and methods to provide a sports-based interactive experience
TWI716964B (en) * 2019-08-09 2021-01-21 天下數位科技股份有限公司 Auxiliary card pressing structure of shuffler
US11408787B2 (en) * 2019-10-15 2022-08-09 Cirrus Logic, Inc. Control methods for a force sensor system
CN110794385B (en) * 2019-10-18 2021-07-13 北京空间机电研究所 Method and system for evaluating zero gravity pointing of laser
US11380175B2 (en) 2019-10-24 2022-07-05 Cirrus Logic, Inc. Reproducibility of haptic waveform
US11545951B2 (en) 2019-12-06 2023-01-03 Cirrus Logic, Inc. Methods and systems for detecting and managing amplifier instability
US11662821B2 (en) 2020-04-16 2023-05-30 Cirrus Logic, Inc. In-situ monitoring, calibration, and testing of a haptic actuator
US11933822B2 (en) 2021-06-16 2024-03-19 Cirrus Logic Inc. Methods and systems for in-system estimation of actuator parameters
US11908310B2 (en) 2021-06-22 2024-02-20 Cirrus Logic Inc. Methods and systems for detecting and managing unexpected spectral content in an amplifier system
US11765499B2 (en) 2021-06-22 2023-09-19 Cirrus Logic Inc. Methods and systems for managing mixed mode electromechanical actuator drive
US11552649B1 (en) 2021-12-03 2023-01-10 Cirrus Logic, Inc. Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4302190A (en) * 1979-12-19 1981-11-24 The United States Of America As Represented By The Secretary Of The Navy Rifle recoil simulator
US4380437A (en) * 1981-09-04 1983-04-19 Yarborough Jr G Wirth Small weapons simulator
US4909260A (en) * 1987-12-03 1990-03-20 American Health Products, Inc. Portable belt monitor of physiological functions and sensors therefor
US5232223A (en) * 1992-03-24 1993-08-03 Larry Dornbusch Electronic game controller
US5403192A (en) * 1993-05-10 1995-04-04 Cae-Link Corporation Simulated human lung for anesthesiology simulation
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US5542672A (en) 1995-03-17 1996-08-06 Meredith; Chris Fishing rod and reel electronic game controller
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6155926A (en) * 1995-11-22 2000-12-05 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control
US6219032B1 (en) * 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US6147674A (en) * 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US6374255B1 (en) * 1996-05-21 2002-04-16 Immersion Corporation Haptic authoring
US5921780A (en) * 1996-06-28 1999-07-13 Myers; Nicole J. Racecar simulator and driver training system and method
IL119463A (en) * 1996-10-21 2000-08-31 Kwalwasser Yaakov Recoil simulator for a weapon
JP3882287B2 (en) * 1997-03-07 2007-02-14 Sega Corp. Fishing equipment
US6421048B1 (en) * 1998-07-17 2002-07-16 Sensable Technologies, Inc. Systems and methods for interacting with virtual objects in a haptic virtual reality environment
EP1173257A1 (en) * 1999-04-01 2002-01-23 Dominic Choy Simulated human interaction systems
US20020010021A1 (en) * 1999-08-03 2002-01-24 Mccauley Jack Jean Method and device for optical gun interaction with a computer game system
US20040259644A1 (en) * 1999-08-03 2004-12-23 Mccauley Jack Jean Method and device for optical gun interaction with a computer system
US6592461B1 (en) * 2000-02-04 2003-07-15 Roni Raviv Multifunctional computer interactive play system
US7445550B2 (en) * 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US6761637B2 (en) * 2000-02-22 2004-07-13 Creative Kingdoms, Llc Method of game play using RFID tracking device
US6554706B2 (en) * 2000-05-31 2003-04-29 Gerard Jounghyun Kim Methods and apparatus of displaying and evaluating motion data in a motion game apparatus
US20020119822A1 (en) * 2001-02-28 2002-08-29 Kunzle Adrian E. Systems and methods wherein a player device continues game play independent of a determination of player input validity
US7202851B2 (en) 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
JP3442754B2 (en) * 2001-08-10 2003-09-02 Konami Computer Entertainment Tokyo Inc. Gun shooting game apparatus, computer control method and program
US6817979B2 (en) * 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20040064298A1 (en) * 2002-09-26 2004-04-01 Robert Levine Medical instruction using a virtual patient
US20040180719A1 (en) * 2002-12-04 2004-09-16 Philip Feldman Game controller support structure and isometric exercise system and method of facilitating user exercise during game interaction
US20040211104A1 (en) * 2003-04-28 2004-10-28 Eberle Glen Richard Universal modular gunstock
US8992322B2 (en) * 2003-06-09 2015-03-31 Immersion Corporation Interactive gaming systems with haptic feedback
JP3931889B2 (en) * 2003-08-19 2007-06-20 Sony Corporation Image display system, image display apparatus, and image display method
US20050191601A1 (en) * 2004-02-26 2005-09-01 Vojtech Dvorak Training weapon
US7806759B2 (en) * 2004-05-14 2010-10-05 Konami Digital Entertainment, Inc. In-game interface with performance feedback
US7542040B2 (en) * 2004-08-11 2009-06-02 The United States Of America As Represented By The Secretary Of The Navy Simulated locomotion method and apparatus

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5741182A (en) * 1994-06-17 1998-04-21 Sports Sciences, Inc. Sensing spatial movement
US5846086A (en) * 1994-07-01 1998-12-08 Massachusetts Institute Of Technology System for human trajectory learning in virtual environments
US6966778B2 (en) * 1995-01-20 2005-11-22 Vincent J. Macri Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
US6722888B1 (en) * 1995-01-20 2004-04-20 Vincent J. Macri Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
US5833549A (en) * 1995-11-14 1998-11-10 Interactive Light, Inc. Sports trainer and game
US6366272B1 (en) * 1995-12-01 2002-04-02 Immersion Corporation Providing interactions between simulated objects using force feedback
US7158112B2 (en) * 1995-12-01 2007-01-02 Immersion Corporation Interactions between simulated objects with force feedback
US20060030383A1 (en) * 1995-12-01 2006-02-09 Rosenberg Louis B Force feedback device for simulating combat
US6033227A (en) * 1996-02-26 2000-03-07 Nec Corporation Training apparatus
US6343987B2 (en) * 1996-11-07 2002-02-05 Kabushiki Kaisha Sega Enterprises Image processing device, image processing method and recording medium
US20050250083A1 (en) * 1997-10-06 2005-11-10 Macri Vincent J Method and apparatus for instructors to develop pre-training lessons using controllable images
US7259761B2 (en) * 1998-07-17 2007-08-21 Sensable Technologies, Inc. Systems and methods for sculpting virtual objects in a haptic virtual reality environment
US20020021277A1 (en) * 2000-04-17 2002-02-21 Kramer James F. Interface for controlling a graphical image
US20020034980A1 (en) * 2000-08-25 2002-03-21 Thomas Lemmons Interactive game via set top boxes
US6402154B1 (en) * 2000-12-15 2002-06-11 Michael Hess Simulated football game
US6712692B2 (en) * 2002-01-03 2004-03-30 International Business Machines Corporation Using existing videogames for physical training and rehabilitation
US20040053686A1 (en) * 2002-09-12 2004-03-18 Pacey Larry J. Gaming machine performing real-time 3D rendering of gaming events
US20040219980A1 (en) * 2003-04-30 2004-11-04 Nintendo Co., Ltd. Method and apparatus for dynamically controlling camera parameters based on game play events
US7544137B2 (en) * 2003-07-30 2009-06-09 Richardson Todd E Sports simulation system
US20070060337A1 (en) * 2005-08-19 2007-03-15 Aruze Corp. Game program and game system
US20070134639A1 (en) * 2005-12-13 2007-06-14 Jason Sada Simulation process with user-defined factors for interactive user training

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8005656B1 (en) * 2008-02-06 2011-08-23 Ankory Ran Apparatus and method for evaluation of design
WO2010008680A3 (en) * 2008-06-24 2010-03-11 Microsoft Corporation Physics simulation-based interaction for surface computing
US8502795B2 (en) 2008-06-24 2013-08-06 Microsoft Corporation Physics simulation-based interaction for surface computing
US20090315839A1 (en) * 2008-06-24 2009-12-24 Microsoft Corporation Physics simulation-based interaction for surface computing
US8154524B2 (en) 2008-06-24 2012-04-10 Microsoft Corporation Physics simulation-based interaction for surface computing
CN102132253A (en) * 2008-06-24 2011-07-20 微软公司 Physics simulation-based interaction for surface computing
US20110095877A1 (en) * 2008-12-16 2011-04-28 Casparian Mark A Apparatus and methods for mounting haptics actuation circuitry in keyboards
US20110102326A1 (en) * 2008-12-16 2011-05-05 Casparian Mark A Systems and methods for implementing haptics for pressure sensitive keyboards
US20100148999A1 (en) * 2008-12-16 2010-06-17 Casparian Mark A Keyboard with user configurable granularity scales for pressure sensitive keys
US8674941B2 (en) 2008-12-16 2014-03-18 Dell Products, Lp Systems and methods for implementing haptics for pressure sensitive keyboards
US9342149B2 (en) 2008-12-16 2016-05-17 Dell Products Lp Systems and methods for implementing haptics for pressure sensitive keyboards
US9791941B2 (en) 2008-12-16 2017-10-17 Dell Products Lp Keyboard with user configurable granularity scales for pressure sensitive keys
US8711011B2 (en) 2008-12-16 2014-04-29 Dell Products, Lp Systems and methods for implementing pressure sensitive keyboards
US8760273B2 (en) 2008-12-16 2014-06-24 Dell Products, Lp Apparatus and methods for mounting haptics actuation circuitry in keyboards
US9246487B2 (en) 2008-12-16 2016-01-26 Dell Products Lp Keyboard with user configurable granularity scales for pressure sensitive keys
US20180229078A1 (en) * 2011-08-29 2018-08-16 Icuemotion Llc Inertial sensor motion tracking and stroke analysis system
US10610732B2 (en) * 2011-08-29 2020-04-07 Icuemotion Llc Inertial sensor motion tracking and stroke analysis system
US8700829B2 (en) 2011-09-14 2014-04-15 Dell Products, Lp Systems and methods for implementing a multi-function mode for pressure sensitive sensors and keyboards
US20140104320A1 (en) * 2012-10-17 2014-04-17 Perceptive Pixel, Inc. Controlling Virtual Objects
US9589538B2 (en) * 2012-10-17 2017-03-07 Perceptive Pixel, Inc. Controlling virtual objects
US9368300B2 (en) 2013-08-29 2016-06-14 Dell Products Lp Systems and methods for lighting spring loaded mechanical key switches
US9959996B2 (en) 2013-08-29 2018-05-01 Dell Products Lp Systems and methods for lighting spring loaded mechanical key switches
US9343248B2 (en) 2013-08-29 2016-05-17 Dell Products Lp Systems and methods for implementing spring loaded mechanical key switches with variable displacement sensing
US9256348B2 (en) * 2013-12-18 2016-02-09 Dassault Systemes Americas Corp. Posture creation with tool pickup
US20150169174A1 (en) * 2013-12-18 2015-06-18 Dassault Systemes DELMIA Corp. Posture Creation With Tool Pickup
US9111005B1 (en) 2014-03-13 2015-08-18 Dell Products Lp Systems and methods for configuring and controlling variable pressure and variable displacement sensor operations for information handling systems
US10668353B2 (en) 2014-08-11 2020-06-02 Icuemotion Llc Codification and cueing system for sport and vocational activities
US11455834B2 (en) 2014-08-11 2022-09-27 Icuemotion Llc Codification and cueing system for sport and vocational activities
US10025099B2 (en) 2015-06-10 2018-07-17 Microsoft Technology Licensing, Llc Adjusted location hologram display
US11367364B2 (en) 2015-08-28 2022-06-21 Icuemotion Llc Systems and methods for movement skill analysis and skill augmentation
US11763697B2 (en) 2015-08-28 2023-09-19 Icuemotion Llc User interface system for movement skill analysis and skill augmentation
US10854104B2 (en) 2015-08-28 2020-12-01 Icuemotion Llc System for movement skill analysis and skill augmentation and cueing
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
US11301031B2 (en) 2016-03-31 2022-04-12 Sony Corporation Information processing apparatus and display control method to control a display position of a virtual object
US11230375B1 (en) 2016-03-31 2022-01-25 Steven M. Hoffberg Steerable rotating projectile
CN108885523A (en) * 2016-03-31 2018-11-23 索尼公司 Information processing equipment, display control method and program
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
WO2022048403A1 (en) * 2020-09-01 2022-03-10 Mofa (Shanghai) Information Technology Co., Ltd. Virtual role-based multimodal interaction method, apparatus and system, storage medium, and terminal
CN113420031A (en) * 2021-06-30 2021-09-21 China National Aviation Fuel Co., Ltd. Aviation fuel data issuing system and aviation fuel data issuing method
WO2023092229A1 (en) * 2021-11-26 2023-06-01 Lululemon Athletica Canada Inc. Method and system to provide multisensory digital interaction experiences
CN114433506A (en) * 2022-04-11 2022-05-06 北京霍里思特科技有限公司 Sorting machine

Also Published As

Publication number Publication date
US9804672B2 (en) 2017-10-31
WO2007133251A3 (en) 2009-05-22
US20100261526A1 (en) 2010-10-14
WO2007133251A2 (en) 2007-11-22

Similar Documents

Publication Publication Date Title
US20060277466A1 (en) Bimodal user interaction with a simulated object
US8747196B2 (en) Force feedback device for simulating combat
Andrews et al. Hapticast: a physically-based 3D game with haptic feedback
US20040130525A1 (en) Dynamic touch screen amusement game controller
JP2019535347A (en) Method and system for using sensor of control device for game control
US7682250B2 (en) Method and apparatus for simulating interactive spinning bar gymnastics on a 3D display
CN105764582B (en) Game device, game system, program and recording medium
US10328339B2 (en) Input controller and corresponding game mechanics for virtual reality systems
KR20090003337A (en) Method for automatically adapting virtual equipment model
WO2007032122A1 (en) Game control program, game machine, and game control method
JP2011072731A (en) Game program, game device, and game control method
US20060199626A1 (en) In-game shot aiming indicator
JP5258710B2 (en) GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD
JP5736407B2 (en) GAME PROGRAM AND GAME DEVICE
Bozgeyikli et al. Tangiball: Dynamic embodied tangible interaction with a ball in virtual reality
TW201503938A (en) 360 degrees-surrounding virtual reality of fishing system
JP3835477B2 (en) Program for controlling execution of game and game apparatus for executing the program
Hemmingsen Movement compression, sports and eSports
JP6863678B2 (en) Program and game equipment
Richard et al. Modeling dynamic interaction in virtual environments and the evaluation of dynamic virtual fixtures
JP2014144360A (en) Game program and game device
US20040113931A1 (en) Human-computer interfaces incorporating haptics and path-based interaction
JP6945699B2 (en) Program and game equipment
Pitura Sword fighting in virtual reality: Where are we and how do we make it real
EP2175948A1 (en) A method and device for controlling a movement sequence within the course of a simulated game or sport event

Legal Events

Date Code Title Description
AS Assignment
Owner name: NOVINT TECHNOLOGIES, INC., NEW MEXICO
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ANDERSON, THOMAS G;REEL/FRAME:018544/0947
Effective date: 20061115

STCB Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION