CN103975290A - Methods and systems for gesture-based petrotechnical application control - Google Patents
- Publication number: CN103975290A
- Application number: CN201280045095.8A
- Authority
- CN
- China
- Prior art keywords
- gesture
- recognition
- processor
- user
- said processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Abstract
Gesture-based petrotechnical application control is provided. At least some embodiments involve controlling the view of a petrotechnical application by capturing images of a user; creating a skeletal map based on the user in the images; recognizing a gesture based on the skeletal map; and implementing a command based on the recognized gesture.
Description
Cross-reference to related applications
None.
Background
Producing hydrocarbons from a subsurface reservoir is a complex operation that includes preliminary exploration and reservoir modeling using seismic data. To increase production from a reservoir, oil and gas companies may also use reservoir models to simulate extraction techniques, and subsequently implement actual extraction based on the identified results. The ability to visually analyze data increases the amount of useful information that can be extracted. Together with advances in computer technology and improvements in reservoir modeling, this ability has led to increases in both the complexity and the accuracy of reservoir models.
Petrotechnical applications may display a seismic or reservoir model to a user as a three-dimensional (3D) view of a physical space. The user interacts with and controls the 3D view using input devices such as a mouse and keyboard. However, these input devices are not intuitive for the user when interacting with the application. Thus, any invention that makes interaction with petrotechnical applications more intuitive and fluid is beneficial.
Brief Description of the Drawings
For a detailed description of exemplary embodiments, reference will now be made to the accompanying drawings, in which:
Fig. 1 shows an example user interaction with an application, in accordance with some embodiments.
Fig. 2 shows an example user interaction with an application, in accordance with some embodiments.
Fig. 3 shows an example user interaction with an application, in accordance with some embodiments.
Fig. 4 shows an example user interaction with an application, in accordance with some embodiments.
Fig. 5 shows a skeletal map of a user's hand, in accordance with some embodiments.
Fig. 6 shows gesture-based control of a menu of an application, in accordance with some embodiments.
Fig. 7 shows, in block diagram form, a hardware system in accordance with some embodiments.
Fig. 8 shows, in block diagram form, software components in accordance with some embodiments.
Fig. 9 shows, in block diagram form, a computer system in accordance with some embodiments.
Fig. 10 shows, in flow diagram form, a method in accordance with at least some embodiments.
Notation and Nomenclature
Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, different companies may refer to a component and/or method by different names. This document does not intend to distinguish between components and/or methods that differ in name but not in function.
In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to...". Also, the terms "couple" or "connect" are intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection, or through an indirect connection via other devices and connections.
Detailed Description
The following discussion is directed to various embodiments of the invention. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and is not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
The various embodiments are directed to control of an interactive petrotechnical application, where the control is provided by way of physical movements or gestures of a user interacting with the application. In addition to physical gestures, the interactive application may also be controlled by a combination of physical gestures and/or voice commands. The specification first turns to a high-level overview of control of a petrotechnical application, and then to details of implementing such control.
Fig. 1 shows an interactive petrotechnical application 108 under gesture control by a user. A user 112 interacts with a three-dimensional view 110 projected onto the two-dimensional display of the application 108. In one embodiment, the view 110 may be a three-dimensional view of a geologic model of a hydrocarbon-bearing formation projected onto the two-dimensional display. In another embodiment, the view 110 may be a three-dimensional view of a hydrocarbon-bearing formation created from seismic data and projected onto the two-dimensional display. In various embodiments, as the user stands (or sits) in front of the display of the application 108 and the system 106, the system 106 captures images of the user 112 and associates the images with a skeletal map, such as skeletal map 100. Based on the illustrative skeletal map 100, the system 106 tracks changes in body position by tracking the identified skeletal joints of interest, and from the tracked motion (e.g., its shape, speed, and amplitude) determines what gesture the user 112 is making. The system 106 implements the command associated with the recognized gesture, and the command is carried out in the application 108. For example, the user 112 may interact with the view 110 by commanding the application 108, by way of gestures made with his body, to change the view 110, such as: rotating the model; zooming in or out; panning left or right; or making changes, additions, or deletions to the model.
In the particular example of Fig. 1, the user 112 makes a circular gesture 104 with his hand 102. The system 106 captures images of the user, which are fed into the system, and associates the user images with the corresponding skeletal map 100. The system may associate the recognized circular gesture 104 with a command to the application 108 to rotate the view 110 about its y-axis, so that the user 112 can inspect the view from another angle. Thus, when the user 112 makes the circular gesture 104, the system 106 recognizes the movement associated with the skeletal map 100 and translates the gesture motion into the particular command that rotates the view 110. In the second frame of Fig. 1, the interactive application 108 responds to the gesture by showing the view 110 rotated. The circular gesture 104 made by the user 112, which causes the three-dimensional view 110 to rotate about its y-axis, is one example of what a recognizable gesture may do; circular gestures, however, are not limited to rotation-type commands.
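The capture, skeletal-map, recognize, implement pipeline described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions: the joint names, the motion test, and the gesture-to-command table are all hypothetical, since the disclosure does not specify an API.

```python
# Hypothetical sketch of the gesture-to-command pipeline:
# capture frames -> skeletal map -> recognize gesture -> dispatch command.
from dataclasses import dataclass


@dataclass
class SkeletalMap:
    # joint name -> (x, y, z) position, one map per captured frame
    joints: dict


def recognize_gesture(history):
    """Classify a sequence of skeletal maps into a named gesture.

    A real system would compare the shape, speed, and amplitude of joint
    trajectories; this stand-in only checks gross horizontal hand travel.
    """
    xs = [m.joints["right_hand"][0] for m in history]
    if max(xs) - min(xs) > 0.5:   # wide sweep stands in for a "circle"
        return "circle"
    return None


# gesture -> application command (illustrative pairing only)
COMMANDS = {"circle": "rotate_y"}


def dispatch(history, applied_commands):
    """Recognize a gesture from the frame history and record its command."""
    gesture = recognize_gesture(history)
    if gesture in COMMANDS:
        applied_commands.append(COMMANDS[gesture])
    return applied_commands
```

A sweep of the tracked hand across more than half the normalized field would thus be recognized and mapped to a rotation command, while a stationary hand produces no command.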
Fig. 2 shows another embodiment of controlling a petrotechnical application by way of gestures. Specifically, Fig. 2 shows a gesture, presented by movement of the head 200 of the user 112, controlling the view presented by the application 108. In this example, if the user 112 tilts his head to the right (as shown in the left half of Fig. 2), the view 110 responds correspondingly, such as by changing the viewing angle as if the user were looking around the right side of the three-dimensional view. Likewise, if the user tilts his head to the left (as shown in the right half of Fig. 2), the view of the object changes correspondingly. The head-tilt gesture made by the user 112, which causes the view 110 to change, is one example of what a recognized gesture may do; head-tilt gestures, however, are not limited to view-change commands.
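One hedged way to realize the head-tilt behavior is to estimate a roll angle from two tracked head joints and map it to a view-angle offset. The joint choice, the linear mapping, and the gain are assumptions for illustration; the disclosure does not prescribe a formula.

```python
import math


def head_tilt_to_yaw(left_ear, right_ear, gain=1.5):
    """Map head roll (tilt), estimated from two tracked head joints,
    to a camera yaw offset for "looking around" the 3-D view.

    `left_ear`/`right_ear` are (x, y) joint positions; `gain` is an
    assumed tuning constant. Roll is zero when the head is level.
    """
    dx = right_ear[0] - left_ear[0]
    dy = right_ear[1] - left_ear[1]
    roll = math.degrees(math.atan2(dy, dx))
    return gain * roll   # positive tilt yields a look to one side
```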
Fig. 3 shows yet another embodiment of controlling a petrotechnical application through the use of gestures. Specifically, Fig. 3 shows a gesture in the form of a change in the distance between the user and the display of the application 108. The user may make a gesture by physically moving closer to or farther from the system 106 to command the application 108 to change the zoom level of the view 110. For example, in Fig. 3 the user 112 stands at a distance "d" 300 from the display of the application 108. By moving a distance "x" 302 toward the screen, the view 110 "zooms in" by an amount proportional to the distance traveled (e.g., a programmed ratio of zoom percentage to distance traveled). If the user 112 walks farther forward, the view zooms in further. If the user 112 walks backward, the view zooms out (e.g., based on the programmed ratio between distance traveled and zoom level). The gestures made by the user 112 of moving closer to or farther from the display of the application 108, causing the zoom of the view 110 to change, are examples of what recognized gestures may do; gestures of moving toward or away from the display, however, are not limited to zooming in or out of the application.
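The proportional zoom just described could be sketched as below. The linear `gain` stands in for the "programmed ratio" of zoom percentage to distance traveled mentioned in the text; the specific value is an assumption.

```python
def zoom_factor(d, x, gain=0.5):
    """Return a multiplicative zoom factor for a user who started at
    distance `d` from the display and moved `x` toward it.

    Negative `x` means the user walked backward, yielding a zoom-out.
    `gain` is an assumed stand-in for the programmed zoom/distance ratio.
    """
    if d <= 0:
        raise ValueError("initial distance must be positive")
    return 1.0 + gain * (x / d)
```

For a user starting 2 m from the display, stepping 1 m closer gives a 1.25x zoom-in and stepping 1 m back gives a 0.75x zoom-out under this assumed gain.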
As previously mentioned, and not limited to the commands discussed, the user's gestures may directly control the view 110, or views of the application 108, based on gestures. In addition, the user's gestures may specifically control menus, such as opening a file, sharing a file, or saving a file. Moreover, in some embodiments, more than one user may control the application using gesture-based commands.
Fig. 4 shows two users interacting with the application 108 by way of collaborative gestures. Specifically, Fig. 4 shows user 112 and user 408 interacting collaboratively with the application 108. For user 408, the system 106 further creates a second skeletal map based on the user 408 in the captured images, and recognizes a gesture based on the second skeletal map of user 408 to create a second recognized gesture. The system implements a command based on the gesture of user 112 by adding or modifying an object in the three-dimensional view 110, and subsequently implements a command based on the recognized gesture of user 408, modifying the object in the three-dimensional view 110. For example, user 112 makes gesture 404 to "draw" a seismic line 412 in a seismic volume, as shown in the view 110 on the display of the application 108. User 408 may subsequently modify the placement of the drawn seismic line 412 by making gesture 406 to select the desired seismic line 412 and then making gesture 410 to move the seismic line 412 to a selectable location. The system 106 recognizes the gestures of both users and implements the commands based on the recognized gestures. The gestures made by users 112 and 408 to draw and modify a seismic line on a seismic volume are one example of how collaborative gestures can affect the application; two or more users interacting with the application, however, are not limited to such interactions.
While the skeletal map and skeletal joint identification may encompass the entire body, a skeletal map may also be created for a smaller, selected portion of the body, such as the user's hand. Turning now to Fig. 5, in some embodiments the system 106 creates a skeletal map of the user's hand. Specifically, the leftmost illustration of Fig. 5 shows an image of a hand (such as hand 102 of user 112) captured by the system 106. The middle illustration of Fig. 5 shows the image of the hand 102 overlaid with the corresponding skeletal map 500 created by the system 106. In the rightmost illustration of Fig. 5, the skeletal map 500 is shown with several skeletal joints, such as thumb joint 502. When the user 112 makes a gesture with the hand, the system 106 can recognize the gesture and implement the corresponding command. For example, by moving his thumb, the user can make a gesture-based "click" command to select a menu item, where the system 106 captures the motion of the skeletal joint 502 and recognizes the motion as a recognized gesture corresponding to, for example, a "click" to select a menu item (as described in greater detail below). In another embodiment, the user 112 may make a "swipe" motion with the hand 102 as a gesture representing a command to shift the application view. In yet another embodiment, the user 112 may clench the hand 102 into a fist, indicating a command to close the current view of the application. In still another embodiment, menus associated with the application may also be controlled by hand gestures.
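The thumb-joint "click" could be detected, under stated assumptions, as a quick dip-and-return of the tracked joint's vertical coordinate. The threshold value and normalized units are hypothetical; the disclosure does not define the detection criterion.

```python
def detect_click(thumb_y, dip=0.02):
    """Detect a gesture-based "click": the thumb joint (cf. joint 502)
    dips by at least `dip` below its starting height and then returns
    near that height within the sampled window.

    `thumb_y` is the per-frame vertical coordinate of the thumb joint
    in normalized skeleton units; `dip` is an assumed threshold.
    """
    if len(thumb_y) < 3:
        return False   # too few frames to contain a dip and return
    start = thumb_y[0]
    lowest = min(thumb_y)
    returned = abs(thumb_y[-1] - start) < dip / 2
    return (start - lowest) >= dip and returned
```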
Turning to Fig. 6, Fig. 6 shows another embodiment of using gestures to control a menu associated with, and displayed as part of, a petrotechnical application. Specifically, Fig. 6 shows a gesture 606 for controlling a menu 600. In this embodiment, the user 112 makes gesture 606 to interact with the menu 600. Like the other gestures described above, specific menu-control gestures may be programmed. In one embodiment, the user 112 may make a gesture to bring up a cursor in the application 108. The user 112 may then move his hand 102 to steer the path of the cursor over the "menu" icon 600, and make a "click" gesture 606, just as if clicking a physical mouse button. The "click" gesture 606 may correspond to activating the menu 600, and activation of the menu 600 may present submenu items. The user 112 may move his hand 102 to move the cursor 608 within the view of the application 108, thereby selecting and activating further menu options, such as the menu option "Open" represented by icon 602 and "Save" represented by icon 604. "Clicking" to open a menu and "clicking" to activate other menu options are a few examples of what a recognized gesture may do; the "click" gesture, however, is not limited to menu-control commands, nor is menu control limited to the exemplary gestures described.
As discussed so far, commands may be implemented by the system 106 recognizing physical gestures made by one or more users and acting on the recognized gestures. However, commands may also be issued to the application 108 by voice commands combined with physical gestures, or by voice commands alone. Specifically, the system 106 may receive audio and video data corresponding to a user controlling the application 108 in the form of physical and audio gestures. For example, in one embodiment, the application may be controlled by a user making gestures with his right hand. To switch application control to the other hand, the user issues the command to change hands by clapping his hands together. The system recognizes the audio sound of the two hands clapping together as a command, and recognizes the physical clapping gesture, to change the handedness (right-handed or left-handed) of the application's control. Although this exemplary embodiment combines physical and audio gestures, commands may also be implemented by physical gestures alone, by audio gestures alone, or by combinations of physical and audio gestures.
In another embodiment, the combination of physical and audio gestures may help commands to be carried out more precisely. For example, the user 112 may wish to rotate the three-dimensional view 110 exactly 43 degrees about the x-axis. A hand gesture by itself may be unable to perform a motion of exactly 43 degrees; in conjunction with the physical gesture, however, the user 112 can issue a verbal command to stop the rotation once 43 degrees have been swept. In another embodiment, two users interacting with the application may operate such that one user issues commands using physical gestures and the second user modifies or adds to the first user's commands by issuing verbal commands. The audio gestures described above, used alone or combined with physical gestures, are examples of commands based on audio gestures; audio gestures, however, are not limited to these interactions.
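The gesture-plus-voice precision example above can be sketched as a small controller: a physical gesture starts a continuous rotation, each captured frame advances it, and a spoken "stop" freezes it at the current angle. The event names and the one-degree-per-frame step are illustrative assumptions, not details from the disclosure.

```python
class RotationController:
    """Sketch of combining a physical gesture with a verbal command."""

    def __init__(self, step_deg=1.0):
        self.step = step_deg     # assumed per-frame rotation increment
        self.angle = 0.0
        self.rotating = False

    def on_gesture(self, name):
        if name == "rotate_x":   # physical gesture begins the rotation
            self.rotating = True

    def on_frame(self):
        if self.rotating:        # rotation advances while active
            self.angle += self.step

    def on_voice(self, word):
        if word == "stop":       # verbal command ends it precisely
            self.rotating = False
```

Starting the rotation by gesture, letting 43 frames elapse, then saying "stop" leaves the view at exactly 43 degrees, with later frames having no effect.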
The specification now turns to a more detailed description of the system 106. The system 106 may be a combined collection of hardware components and software components, the hardware components working together with the software components to capture images of the user, create skeletal maps, associate recognized gestures (visual and/or audio) with particular commands, and carry out the commands in the application. Fig. 7 shows, in block diagram form, the hardware components of the system 106 in accordance with various embodiments. Specifically, Fig. 7 shows a sensor device 702, a computer system 704, and a display device 706.
Turning first to the capture of image and audio data related to the user, the sensor device 702 may comprise multiple components for capturing images and audio related to the user. The sensor device 702 may be configured to capture image data of the user by any of a variety of video input options. In one embodiment, image data may be captured by one or more color or black-and-white cameras 710. In another embodiment, image data may be captured by two or more physically separate stereoscopic cameras 712, which view the user from different angles to capture depth information. In yet another embodiment, image data may be captured by an infrared sensor 714 that detects infrared light. Audio may be captured by a microphone 716 or by two or more stereo microphones 718. In one embodiment, the sensor device 702 may include one or more cameras and/or microphones; in other embodiments, however, the video and/or audio capture devices may be externally coupled to the sensor device 702 and/or the computer system 704.
The sensor device 702 may be coupled to the computer system 704 by a wired connection, such as a Universal Serial Bus (USB) connection or a FireWire connection, or coupled to the computer system 704 by a wireless connection. In one embodiment, the computer system 704 is a standalone computer, while in other embodiments the computer system 704 may be a group of networked computers. In yet another embodiment, the sensor device 702 and the computer system 704 may comprise an integrated device 708 (for example, a laptop, notebook, tablet, or smartphone with the sensor device in the lid). The sensor device 702 and the computer system 704 are coupled to the display device 706. In one embodiment, the display device 706 may be a monitor (for example, a liquid crystal display, a plasma monitor, or a CRT monitor). In other embodiments, the display device 706 may be a projection device that projects the application onto a two-dimensional surface. The specification now turns to a more detailed description of the software of the system 106, shown in Fig. 8, which illustrates the various software components that may work in conjunction with the sensor device 702 and the computer system 704 to implement the various embodiments.
Computer software
The computer system 704 may include a number of software components, including one or more skeletal tracking application programming interfaces (APIs) 802, a skeletal toolkit 804, gesture-based application control software 806, and software libraries 808. Each of these software components is discussed in turn below.
The skeletal tracking API 802 is a software library focused on real-time image processing functionality, providing support for the sensor device 702 in capturing and tracking body movements, while also providing support for audio data capture (for example, the open-source OpenCV API, or OpenNI, available from the OpenNI organization). As discussed earlier, the sensor device 702 captures images of the user. The API 802 then creates the associated skeletal map and tracks the skeletal joint motions that may correspond to gestures, in order to control the application. The skeletal toolkit 804 (for example, the Flexible Action and Articulated Skeleton Toolkit, or FAAST, developed by the Institute for Creative Technologies at the University of Southern California) facilitates the integration of gesture-based application control using the skeletal map and skeletal joint tracking, and may interact with the skeletal tracking API 802. In another embodiment, the skeletal toolkit 804 need not interact with the skeletal tracking API 802, but instead interacts with other gesture-based application control software 806 to analyze gestures and associate the gestures with commands to control the petrotechnical application. As the API 802 analyzes skeletal joint motions, it compares the motion against a library of recognized gestures. If the motion matches a recognized gesture, the system 106 implements the associated command in the application. While there may be a predefined library of recognized skeletal-joint gestures (the gesture recognition library 818 within the gesture-based application control software 806), the skeletal toolkit may allow the user to add newly recognized skeletal-joint gesture and application control pairs.
Used in conjunction with the other software, the software libraries 808 may provide additional support for capturing images, recognizing gestures, and implementing commands in the application. Three example libraries are shown in Fig. 8, but any number or type of libraries may be used. In Fig. 8, a geology library 814 provides support for modeling specific geophysical and geological data (such as geological formations and scenes). A graphics library 816 may help support the rendering of shapes and textual information.
While a standalone system has been described thus far in the specification, similar functionality may be implemented by incorporating plug-in modules into existing standalone petrotechnical application software. More specifically, each piece of software that captures images, creates skeletal maps, tracks skeletal joint motions, recognizes gestures, and implements gesture-based commands may be added to pre-existing application control software running on the same or a separate hardware system.
Example Computing Environment
The various embodiments discussed to this point operate in conjunction with computer systems of varying forms. For example, the computer system 704 may be a desktop or laptop system, or may be incorporated with the sensor device 702 into a single system.
Fig. 9 shows a computer system 704 in accordance with at least some embodiments. Any or all of the embodiments that involve interacting with an interactive application, capturing images of a user, creating skeletal maps, tracking skeletal joint motions, recognizing gestures, and carrying out gesture-command pairings may be implemented, in whole or in part, on a computer system such as that shown in Fig. 9, or on computer systems developed in the future. In particular, the computer system 704 comprises a main processor 910 coupled to a main memory array 912, and to various other peripheral computer system components, through an integrated host bridge 914. The main processor 910 may be a single-core device, or a processor implementing multiple processor cores. Moreover, the computer system 704 may implement multiple main processors 910. The main processor 910 couples to the host bridge 914 by way of a host bus 916, or the host bridge 914 may be integrated into the main processor 910. Thus, the computer system 704 may implement other bus configurations or bus bridges in addition to, or in place of, the structure shown in Fig. 9.
The main memory 912 couples to the host bridge 914 through a memory bus 918. Thus, the host bridge 914 comprises a memory control unit that controls transactions to the main memory 912 by asserting control signals for memory accesses. In other embodiments, the main processor 910 directly implements a memory control unit, and the main memory 912 may couple directly to the main processor 910. The main memory 912 functions as the working memory for the main processor 910 and comprises a memory device or array of memory devices in which programs, instructions, and data are stored. The main memory 912 may comprise any suitable type of memory, such as dynamic random access memory (DRAM), or any of the various types of DRAM devices, such as synchronous DRAM (SDRAM), extended data out DRAM (EDO DRAM), or Rambus DRAM (RDRAM). The main memory 912 is an example of a non-transitory computer-readable medium storing programs and instructions; other examples are disk drives and flash memory devices.
The illustrative computer system 704 also comprises a second bridge 928 that bridges the primary expansion bus 926 to various secondary expansion buses, such as a low pin count (LPC) bus 930 and a peripheral component interconnect (PCI) bus 932. Various other secondary expansion buses may be supported by the bridge device 928.
A firmware hub 936 couples to the bridge device 928 by way of the LPC bus 930. The firmware hub 936 comprises read-only memory (ROM) containing software programs executable by the main processor 910. The software programs comprise programs executed during and just after power-on self test (POST) procedures, as well as memory reference code. The POST procedures and memory reference code perform various functions within the computer system before control of the computer system is turned over to the operating system. The computer system 704 further comprises a network interface card (NIC) 938, illustratively coupled to the PCI bus 932. The NIC 938 acts to couple the computer system 704 to a communication network, such as the Internet, or a local-area or wide-area network.
Still referring to Fig. 9, the computer system 704 may further comprise a super input/output (I/O) controller 940 coupled to the bridge 928 via the LPC bus 930. The super I/O controller 940 controls many computer system functions, for example interfacing with various input and output devices such as a keyboard 942, a pointing device 944 (e.g., a mouse), a pointing device in the form of a game controller 946, various serial ports, floppy drives, and disk drives. The super I/O controller 940 is often referred to as "super" because of the many I/O functions it performs.
The computer system 704 may further comprise a graphics processing unit (GPU) 950 coupled to the host bridge 914 by way of a bus 952, such as a PCI Express (PCI-E) bus or an Advanced Graphics Processing (AGP) bus. Other bus systems, including bus systems developed in the future, may equivalently be used. Moreover, the graphics processing unit 950 may alternatively couple to the primary expansion bus 926 or one of the secondary expansion buses (e.g., the PCI bus 932). The graphics processing unit 950 couples to a display device 954, which may comprise any suitable electronic display device upon which any image or text can be plotted and/or displayed. The graphics processing unit 950 may comprise an onboard processor 956 as well as onboard memory 958. The processor 956 may thus perform graphics processing as commanded by the main processor 910. Moreover, the memory 958 may be significant, on the order of several hundred megabytes or more. Thus, once commanded by the main processor 910, the graphics processing unit 950 may perform significant calculations regarding the graphics to be displayed on the display device, and ultimately display such graphics, without further input or assistance from the main processor 910.
The specification now turns to a more detailed description of a method of controlling an interactive application through the use of gestures. Fig. 10 shows a flow diagram depicting an overall method of controlling an application using gestures, according to a sample embodiment. The method starts (block 1000) and moves to controlling the view of the application (block 1002). Controlling the view of the application starts with capturing an image of a user (block 1004). A skeletal map is created based on the user captured in the image (block 1006). If the user makes a gesture, the gesture is recognized based on the skeletal map created in block 1006 (block 1008). If the recognized gesture corresponds to a command, the command is implemented based on the recognized gesture (block 1010). Thereafter, the method ends (block 1012).
From the description provided herein, those skilled in the art are readily able to combine the software created as described with appropriate general-purpose or special-purpose computer hardware to create a computer system and/or computer sub-components in accordance with the various embodiments, to create a computer system and/or computer sub-components for carrying out the methods of the various embodiments, and/or to create a non-transitory computer-readable storage medium (other than a signal traveling along a conductor or carrier wave) for storing a software program to implement the method aspects of the various embodiments.
References to "one embodiment", "an embodiment", "some embodiments", "various embodiments", or the like indicate that a particular element or characteristic is included in at least one embodiment of the invention. Although these phrases may appear in various places, they do not necessarily refer to the same embodiment.
The above discussion is meant to be illustrative of the principles and various embodiments of the present invention. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. For example, while the various software components were described in terms of gesture-based control of petrotechnical applications, the development context shall not be read as a limitation of the scope of the one or more inventions described; the same techniques may be equivalently used for gesture-based analysis and control of other software. It is intended that the following claims be interpreted to embrace all such variations and modifications.
Claims (30)
1. A method comprising:
controlling a view of a petrotechnical application by:
capturing an image comprising a first user;
creating a first skeletal map based on the first user in the image;
identifying a gesture based on the first skeletal map to create a first identified gesture; and
implementing a command based on the first identified gesture.
2. The method of claim 1 wherein:
identifying further comprises identifying a change of head position of the first user as the first identified gesture; and
implementing further comprises changing the view of the petrotechnical application.
3. The method of claim 1 wherein:
identifying further comprises identifying a change of distance between the first user and a camera as the first identified gesture; and
implementing further comprises changing a zoom of the petrotechnical application.
4. The method of claim 1 wherein identifying the gesture further comprises:
training a system to identify a first gesture, the first gesture previously unidentified; and then
identifying the first gesture as the first identified gesture.
5. The method of claim 1 wherein creating the first skeletal map further comprises:
creating a first skeletal map of a hand of the first user; and then
identifying a gesture involving the first skeletal map of the hand of the first user.
6. The method of claim 1 further comprising:
creating a second skeletal map based on a second user in the image;
identifying a gesture based on the second skeletal map to create a second identified gesture;
wherein implementing further comprises adding or modifying an object in a three-dimensional view within the view; and
implementing a command based on the second identified gesture, thereby modifying the object in the three-dimensional view within the view.
7. The method of claim 1 wherein:
identifying further comprises identifying two hands clapped together; and
implementing further comprises changing control of the petrotechnical application to a different hand.
8. The method of claim 7 further comprising verifying the clap of the two hands based on an audible sound received by at least one microphone.
9. The method of claim 1 further comprising:
identifying an audible sound, received by at least one microphone, associated with the first identified gesture; and
implementing the command based on the audible sound.
10. The method of claim 1 wherein identifying further comprises determining a distance of movement by calculating the movement across one or more video cameras.
11. A computer system comprising:
a processor;
a memory coupled to the processor;
a display device coupled to the processor;
the memory storing a program that, when executed by the processor, causes the processor to:
capture an image comprising a first user by way of a camera operatively coupled to the processor;
create a first skeletal map based on the first user in the image;
identify a gesture based on the first skeletal map to create a first identified gesture; and
implement a command based on the first identified gesture, and thereby
change a three-dimensional view of an underground formation displayed on the display device.
12. The computer system of claim 11 wherein:
when the processor identifies, the program further causes the processor to identify a change of head position of the first user as the first identified gesture; and
when the processor implements, the program further causes the processor to change the view of the three-dimensional underground formation displayed on the display device.
13. The computer system of claim 11 further comprising:
a camera system coupled to the processor;
wherein when the processor identifies, the program further causes the processor to identify a change of distance between the first user and the camera system as the first identified gesture; and
wherein when the processor implements, the program further causes the processor to change a zoom of the three-dimensional underground formation displayed on the display device.
14. The computer system of claim 13 further comprising at least one selected from the group consisting of: a stereoscopic camera; a black-and-white camera; a color camera; and an infrared sensor.
15. The computer system of claim 11 wherein when the processor identifies the gesture, the program further causes the processor to:
train the system to identify a first gesture, the first gesture previously unidentified; and then
identify the first gesture as the first identified gesture.
16. The computer system of claim 11 wherein when the processor creates the first skeletal map, the program further causes the processor to:
create a first skeletal map of a hand of the first user; and then
identify a gesture involving the first skeletal map of the hand of the first user.
17. The computer system of claim 11 wherein the program causes the processor to:
create a second skeletal map based on a second user in the image;
identify a gesture based on the second skeletal map to create a second identified gesture;
wherein when the processor implements, the program further causes the processor to add or modify an object in the three-dimensional view displayed on the display device; and
implement a command based on the second identified gesture, thereby modifying the object in the three-dimensional underground formation displayed on the display device.
18. The computer system of claim 11 further comprising:
a microphone coupled to the processor;
wherein when the processor identifies, the program further causes the processor to identify two hands clapped together based on a sound received by the microphone; and
wherein implementing further comprises changing control of the view of the three-dimensional underground formation displayed on the display device.
19. The computer system of claim 18 wherein the program further causes the processor to verify the clap of the two hands based on an audible sound.
20. The computer system of claim 11 wherein the program further causes the processor to:
identify an audible sound associated with the first identified gesture; and
implement the command based on the audible sound.
21. A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to:
control a view of a petrotechnical application by causing the processor to:
capture an image comprising a first user;
create a first skeletal map based on the first user in the image;
identify a gesture based on the first skeletal map to create a first identified gesture; and
implement a command based on the first identified gesture.
22. The non-transitory computer-readable medium of claim 21 wherein:
when the processor identifies, the instructions further cause the processor to identify a change of head position of the first user as the first identified gesture; and
when the processor implements, the instructions further cause the processor to change the view of the petrotechnical application.
23. The non-transitory computer-readable medium of claim 21 wherein:
when the processor identifies, the instructions further cause the processor to identify a change of distance between the first user and a camera as the first identified gesture; and
when the processor implements, the instructions further cause the processor to change a zoom of the petrotechnical application.
24. The non-transitory computer-readable medium of claim 21 wherein the instructions further cause the processor to:
train the system to identify a first gesture, the first gesture previously unidentified; and then
identify the first gesture as the first identified gesture.
25. The non-transitory computer-readable medium of claim 21 wherein:
when the processor creates, the instructions further cause the processor to create a first skeletal map of a hand of the first user; and then
when the processor identifies, the instructions further cause the processor to identify a gesture involving the first skeletal map of the hand of the first user.
26. The non-transitory computer-readable medium of claim 21 wherein the instructions further cause the processor to:
create a second skeletal map based on a second user in the image;
identify a gesture based on the second skeletal map to create a second identified gesture;
wherein when the processor implements, the instructions further cause the processor to add or modify an object in a three-dimensional view; and
implement a command based on the second identified gesture, thereby modifying the object in the three-dimensional view.
27. The non-transitory computer-readable medium of claim 21 wherein:
when the processor identifies, the instructions further cause the processor to identify two hands clapped together; and
when the processor implements, the instructions further cause the processor to change control of the petrotechnical application to a different hand.
28. The non-transitory computer-readable medium of claim 21 wherein the instructions further cause the processor to verify the clap of two hands based on an audible sound.
29. The non-transitory computer-readable medium of claim 21 wherein the instructions further cause the processor to:
identify an audible sound associated with the first identified gesture; and
implement the command based on the audible sound.
30. The non-transitory computer-readable medium of claim 21 wherein the instructions further cause the processor to capture infrared frequencies.
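The training step recited in claims 4, 15, and 24 (identifying a gesture that was previously unidentified) can be sketched in a hypothetical, simplified form: the system records the new gesture as a template displacement vector and later matches observed motion against stored templates by Euclidean distance. The class name, template representation, and tolerance below are illustrative assumptions, not the patent's method.

```python
import math

class GestureTrainer:
    """Minimal template-matching sketch of the train-then-identify step."""

    def __init__(self, tolerance=0.2):
        self.templates = {}        # gesture name -> (dx, dy, dz) template
        self.tolerance = tolerance # maximum distance for a match

    def train(self, name, displacement):
        """Record a template for a gesture the system could not identify before."""
        self.templates[name] = tuple(displacement)

    def identify(self, displacement):
        """Return the trained gesture closest to the observed displacement,
        or None if nothing matches within the tolerance."""
        best, best_dist = None, self.tolerance
        for name, template in self.templates.items():
            dist = math.dist(template, displacement)
            if dist < best_dist:
                best, best_dist = name, dist
        return best
```

After `train("swipe_left", (-0.5, 0.0, 0.0))`, a nearby observed displacement such as `(-0.45, 0.05, 0.0)` is identified as `"swipe_left"`, while the same motion before training yields no match.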
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161535454P | 2011-09-16 | 2011-09-16 | |
US201161535779P | 2011-09-16 | 2011-09-16 | |
US61/535,779 | 2011-09-16 | ||
US61/535,454 | 2011-09-16 | ||
PCT/US2012/044027 WO2013039586A1 (en) | 2011-09-16 | 2012-06-25 | Methods and systems for gesture-based petrotechnical application control |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103975290A true CN103975290A (en) | 2014-08-06 |
Family
ID=47883599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201280045095.8A Pending CN103975290A (en) | 2011-09-16 | 2012-06-25 | Methods and systems for gesture-based petrotechnical application control |
Country Status (8)
Country | Link |
---|---|
US (1) | US20140157129A1 (en) |
EP (1) | EP2742403A4 (en) |
CN (1) | CN103975290A (en) |
AU (1) | AU2012309157B2 (en) |
BR (1) | BR112014006173A2 (en) |
CA (1) | CA2848624C (en) |
MX (1) | MX2014003131A (en) |
WO (1) | WO2013039586A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013147875A2 (en) * | 2012-03-30 | 2013-10-03 | Landmark Graphics Corporation | System and method for automatic local grid refinement in reservoir simulation systems |
US9672389B1 (en) * | 2012-06-26 | 2017-06-06 | The Mathworks, Inc. | Generic human machine interface for a graphical model |
US9245068B1 (en) | 2012-06-26 | 2016-01-26 | The Mathworks, Inc. | Altering an attribute of a model based on an observed spatial attribute |
US9607113B1 (en) * | 2012-06-26 | 2017-03-28 | The Mathworks, Inc. | Linking of model elements to spatial elements |
US9582933B1 (en) | 2012-06-26 | 2017-02-28 | The Mathworks, Inc. | Interacting with a model via a three-dimensional (3D) spatial environment |
US9117039B1 (en) | 2012-06-26 | 2015-08-25 | The Mathworks, Inc. | Generating a three-dimensional (3D) report, associated with a model, from a technical computing environment (TCE) |
US10360052B1 (en) | 2013-08-08 | 2019-07-23 | The Mathworks, Inc. | Automatic generation of models from detected hardware |
JP2015056141A (en) * | 2013-09-13 | 2015-03-23 | ソニー株式会社 | Information processing device and information processing method |
US10220304B2 (en) | 2013-10-14 | 2019-03-05 | Microsoft Technology Licensing, Llc | Boolean/float controller and gesture recognition system |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090228841A1 (en) * | 2008-03-04 | 2009-09-10 | Gesture Tek, Inc. | Enhanced Gesture-Based Image Manipulation |
CN101788876A (en) * | 2009-01-23 | 2010-07-28 | 英华达(上海)电子有限公司 | Method for automatic scaling adjustment and system therefor |
US20100281438A1 (en) * | 2009-05-01 | 2010-11-04 | Microsoft Corporation | Altering a view perspective within a display environment |
US20100306715A1 (en) * | 2009-05-29 | 2010-12-02 | Microsoft Corporation | Gestures Beyond Skeletal |
US20110154266A1 (en) * | 2009-12-17 | 2011-06-23 | Microsoft Corporation | Camera navigation for presentations |
CN102117117A (en) * | 2010-01-06 | 2011-07-06 | 致伸科技股份有限公司 | System and method for control through identifying user posture by image extraction device |
CN102129551A (en) * | 2010-02-16 | 2011-07-20 | 微软公司 | Gesture detection based on joint skipping |
CN102184020A (en) * | 2010-05-18 | 2011-09-14 | 微软公司 | Method for manipulating posture of user interface and posture correction |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8418085B2 (en) * | 2009-05-29 | 2013-04-09 | Microsoft Corporation | Gesture coach |
US8400398B2 (en) * | 2009-08-27 | 2013-03-19 | Schlumberger Technology Corporation | Visualization controls |
US8843857B2 (en) * | 2009-11-19 | 2014-09-23 | Microsoft Corporation | Distance scalable no touch computing |
US9268404B2 (en) * | 2010-01-08 | 2016-02-23 | Microsoft Technology Licensing, Llc | Application gesture interpretation |
US20120144306A1 (en) * | 2010-12-02 | 2012-06-07 | Michael James Moody | Method and system for interacting or collaborating with exploration |
US8994718B2 (en) * | 2010-12-21 | 2015-03-31 | Microsoft Technology Licensing, Llc | Skeletal control of three-dimensional virtual world |
-
2012
- 2012-06-25 BR BR112014006173A patent/BR112014006173A2/en not_active IP Right Cessation
- 2012-06-25 EP EP12832115.5A patent/EP2742403A4/en not_active Withdrawn
- 2012-06-25 CA CA2848624A patent/CA2848624C/en not_active Expired - Fee Related
- 2012-06-25 MX MX2014003131A patent/MX2014003131A/en unknown
- 2012-06-25 CN CN201280045095.8A patent/CN103975290A/en active Pending
- 2012-06-25 US US14/131,924 patent/US20140157129A1/en not_active Abandoned
- 2012-06-25 AU AU2012309157A patent/AU2012309157B2/en not_active Ceased
- 2012-06-25 WO PCT/US2012/044027 patent/WO2013039586A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090228841A1 (en) * | 2008-03-04 | 2009-09-10 | Gesture Tek, Inc. | Enhanced Gesture-Based Image Manipulation |
CN101788876A (en) * | 2009-01-23 | 2010-07-28 | 英华达(上海)电子有限公司 | Method for automatic scaling adjustment and system therefor |
US20100281438A1 (en) * | 2009-05-01 | 2010-11-04 | Microsoft Corporation | Altering a view perspective within a display environment |
US20100306715A1 (en) * | 2009-05-29 | 2010-12-02 | Microsoft Corporation | Gestures Beyond Skeletal |
US20110154266A1 (en) * | 2009-12-17 | 2011-06-23 | Microsoft Corporation | Camera navigation for presentations |
CN102117117A (en) * | 2010-01-06 | 2011-07-06 | 致伸科技股份有限公司 | System and method for control through identifying user posture by image extraction device |
CN102129551A (en) * | 2010-02-16 | 2011-07-20 | 微软公司 | Gesture detection based on joint skipping |
CN102184020A (en) * | 2010-05-18 | 2011-09-14 | 微软公司 | Method for manipulating posture of user interface and posture correction |
Also Published As
Publication number | Publication date |
---|---|
AU2012309157B2 (en) | 2015-12-10 |
MX2014003131A (en) | 2014-08-27 |
BR112014006173A2 (en) | 2017-06-13 |
CA2848624A1 (en) | 2013-03-21 |
CA2848624C (en) | 2019-09-03 |
WO2013039586A1 (en) | 2013-03-21 |
US20140157129A1 (en) | 2014-06-05 |
EP2742403A4 (en) | 2015-07-15 |
EP2742403A1 (en) | 2014-06-18 |
AU2012309157A1 (en) | 2014-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11954808B2 (en) | Rerendering a position of a hand to decrease a size of a hand to create a realistic virtual/augmented reality environment | |
US11392212B2 (en) | Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments | |
US11663785B2 (en) | Augmented and virtual reality | |
CN103975290A (en) | Methods and systems for gesture-based petrotechnical application control | |
CN105637564B (en) | Generate the Augmented Reality content of unknown object | |
US9886102B2 (en) | Three dimensional display system and use | |
Wang et al. | Mixed reality in architecture, design, and construction | |
CN110070556B (en) | Structural modeling using depth sensors | |
Wang | Augmented reality in architecture and design: potentials and challenges for application | |
US20160358383A1 (en) | Systems and methods for augmented reality-based remote collaboration | |
US20120116728A1 (en) | Click to accept as built modeling | |
US11449189B1 (en) | Virtual reality-based augmented reality development system | |
Kodeboyina et al. | Low cost augmented reality framework for construction applications | |
Ge et al. | Integrative simulation environment for conceptual structural analysis | |
Afif et al. | Orientation control for indoor virtual landmarks based on hybrid-based markerless augmented reality | |
Fischbach et al. | smARTbox: out-of-the-box technologies for interactive art and exhibition | |
Nóbrega et al. | Design your room: adding virtual objects to a real indoor scenario | |
Agrawal et al. | HoloLabel: Augmented reality user-in-the-loop online annotation tool for as-is building information | |
Asiminidis | Augmented and Virtual Reality: Extensive Review | |
Huo | Exploration, Study and Application of Spatially Aware Interactions Supporting Pervasive Augmented Reality | |
US20230351706A1 (en) | Scanning interface systems and methods for building a virtual representation of a location | |
Simon | Immersive image-based modeling of polyhedral scenes | |
De Sousa et al. | 5* magic wand: An rgbd camera-based 5 dof user interface for 3d interaction | |
González et al. | An immersive 3D geological and mining data visualization environment | |
Takashima | FACULTY OF GRADUATE STUDIES |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20140806 |