WO2010046541A1 - Method and device for controlling an application - Google Patents

Method and device for controlling an application

Info

Publication number
WO2010046541A1
Authority
WIPO (PCT)
Prior art keywords
mobile device, application, command, auditory, executing
Application number
PCT/FI2009/050856
Other languages
French (fr)
Inventor
Raine Kajastila
Tapio Lokki
Original Assignee
Teknillinen Korkeakoulu
Application filed by Teknillinen Korkeakoulu
Publication of WO2010046541A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/23 Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof
    • H04M1/236 Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof including keys on side or rear faces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00 Transmission systems of control signals via wireless link
    • G08C2201/30 User interface
    • G08C2201/32 Remote control based on movements, attitude of remote control device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/60 Substation equipment, e.g. for use by subscribers including speech amplifiers
    • H04M1/6033 Substation equipment, e.g. for use by subscribers including speech amplifiers for providing handsfree use or a loudspeaker mode in telephone sets
    • H04M1/6041 Portable telephones adapted for handsfree use
    • H04M1/6058 Portable telephones adapted for handsfree use involving the use of a headset accessory device connected to the portable telephone
    • H04M1/6066 Portable telephones adapted for handsfree use involving the use of a headset accessory device connected to the portable telephone including a wireless connection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the invention relates to a method and mobile device for executing a function in an application.
  • the invention also relates to a method and device for executing an application controlled by a mobile device.
  • Mobile devices, and especially mobile stations and personal digital assistants (PDAs), have offered a good foundation for developing devices which can be carried all the time, which are available outside the home, and which are capable of Internet browsing or game playing irrespective of the user's location, in order to solve the aforesaid problem.
  • Mobile stations and PDAs are widely used and are carried by people almost constantly due to their versatile character, so mobile devices play an important role in people's everyday life.
  • Present mobile devices have user interfaces for inputting data and instructions, such as a keyboard, function buttons, a tracking ball, a small "joystick" in the middle of the keyboard, and a touchpad, but their usability for device or application control is limited, since their use requires both of the user's hands while an application is controlled. Thus, it is impossible to take notes during application control.
  • The user's stance can also be quite uncomfortable if he/she is unable to reach a static stance, e.g. by sitting in a chair.
  • Application control through the above-mentioned user interfaces of the mobile device is very troublesome if, for example, the user has to move or travels on a congested bus, subway, or train during rush hour.
  • One object of the invention is to provide a method for executing a function in an application, a mobile device able to execute a function in an application, a method for executing an application controlled by a mobile device, and a device able to execute an application controlled by a mobile device.
  • The object of the invention is fulfilled by providing a method, wherein information relating to an orientation of a mobile device is obtained, said information relating to the orientation of the mobile device is associated to a command for executing a function in an application, and said command is provided to the application for executing said function in said application.
  • the object of the invention is also fulfilled by providing a mobile device, which comprises means for obtaining information relating to an orientation of said device, means for associating said information relating to the orientation of the mobile device to a command for executing a function in an application, and means for providing said command to the application for executing said function in said application.
  • The object of the invention is also fulfilled by providing a method, wherein a command relating to an orientation of a mobile device controlling an application for executing a function in said application is received, said command is executed in said application, and a feedback relating to said received command is provided.
  • The object of the invention is also fulfilled by providing a device, which comprises means for receiving a command relating to an orientation of a mobile device controlling an application for executing a function in said application, means for executing said command in said application, and means for providing a feedback relating to said received command.
  • An auditory user interface is controlled by a user who points and tilts a mobile station in his/her hand in different directions for browsing a spherical ego-centric auditory menu comprising one or more auditory objects.
  • Targeted auditory menu objects are indicated with speech or other sounds and reproduced from corresponding directions with three dimensional audio.
  • The synthesised targeted auditory menu objects are transmitted from the mobile station, which possibly does not comprise a display at all, to the user's headphones so that a three dimensional auditory space is established around the user.
  • An embodiment of the present invention relates to a method according to independent claim 1.
  • an embodiment of the present invention relates to a mobile device according to independent claim 17.
  • an embodiment of the present invention relates to a method according to independent claim 32.
  • an embodiment of the present invention relates to a device according to independent claim 37.
  • A method comprises obtaining information relating to an orientation of a mobile device capable of transmitting and receiving information, e.g. a mobile station or personal digital assistant (PDA), associating said information relating to the orientation of the mobile device to a command, which executes a function such as the selection of the user interface or game object, the movement of the object, or the copying or pasting of the object etc. in an application, and providing said command to the application for executing said function in said application.
  • the application can be e.g. the user interface, voting application, or game of the mobile device or other remote device such as a mobile station, computer, digital video disc (DVD) device, set-top box, video recorder, or sound reproduction equipment.
  • The term "orientation of a mobile device" refers to a situation wherein the mobile device is in the user's hand and is tilted or pointed in a certain direction by the motion of the user's wrist, whereupon a certain orientation of said mobile device is achieved.
  • The method which is disclosed in the previous embodiment further comprises establishing a connection to said application, wherein the user of the mobile device establishes the connection to the application for enabling a desired function to be carried out.
  • The method which is disclosed in any of the previous embodiments comprises that said connection to the application is a wireless connection, such as a radio frequency connection (e.g. Bluetooth, Wibree, and WiFi) or an infrared connection.
  • The method comprises that said obtaining information relating to the orientation of the mobile device comprises determining information relating to a tilt angle and tilt direction of the mobile device, when it is in the user's hand and is pointed in a certain direction or rotated by the natural wrist movement of the user (as with a traditional joystick), from accelerometers implemented in the device. The method uses accelerometers along three axes in order to sense the orientation of the mobile device relative to the direction of the earth's gravity.
  • The method comprises that said obtaining information relating to the orientation of the mobile device further comprises determining whether a key or button of the mobile device (which, at minimum, comprises a single button (key) and no keyboard or display) has been pressed for selecting the function to be executed.
  • Naturally, the determination also relates to double-clicking, touchpad pushing, mobile device squeezing, mobile device shaking, etc.
  • The term "button" therefore also refers to a touchpad, device squeezing, and device shaking, which, like the button, can provide information relating to a function selection.
  • The method comprises that said associating information relating to the orientation of the mobile device to the command for executing the function comprises deducing the command, which is directed to a controlled application, for executing the desired function on the grounds of the tilt angle and tilt direction information relating to the orientation of the mobile device in the user's hand and the button pressing information, which can comprise an indication relating to at least one realised button click or an interfered button click.
  • The method comprises that said providing the command to the application for executing said function comprises sending the command from a mobile device unit processing the association to said application so that the application is able to carry out the desired operation according to the transferred command.
  • the method which is disclosed in any of the previous embodiments, comprises that said method further comprises receiving an audio feedback, a visual feedback, or a haptic feedback relating to the sent command from said application.
  • The method which is disclosed in any of the previous embodiments comprises that said application is provided in another device, e.g. a mobile station, a computer, a digital video disc device, a set-top box, a video recorder, or sound reproduction equipment, and said connection establishment to the application and said command transfer to the application are provided through an air interface between the mobile device and said another device by means of e.g. radio frequency or infrared.
  • the method which is disclosed in any of the previous embodiments, comprises that said another device is a device for playing a game, wherein a feedback is provided visually and/or audibly.
  • The visual feedback is provided on either the display of the game device or the display of the mobile device.
  • The method which is disclosed in any of the previous embodiments comprises that said application, which receives a command produced by the movement (and possibly one or more button clicks) of the mobile device, is provided in said mobile device.
  • the method which is disclosed in any of the previous embodiments, comprises that said application is a three dimensional auditory interface comprising a spherical auditory menu, which comprises one or more auditory objects in an auditory space.
  • The spherical auditory menu can also be presented visually on the display of the mobile device for illustrating the auditory objects on the surface of the spherical menu.
  • The auditory objects can also be located elsewhere than on the sphere surface, e.g. at different distances from the user.
  • the geometry of the visual menu can be e.g. a cylinder or a cube instead of said sphere.
  • the visual spherical menu comprises selectable (auditory) objects and a cursor representing the movement of the mobile station in the user's hand.
  • These auditory objects enable basic use of a mobile station, e.g. dialling, sending short message service (SMS) messages, phonebook browsing, playlist browsing, playback control, and radio station surfing.
  • The auditory objects comprise letters, numbers, commands enabling the execution of function(s), links, (sub)menus, etc.
  • The method comprises that said associating information relating to the orientation of the mobile device to the command to be transferred to an application comprising auditory objects on at least a spherical auditory menu for executing the function comprises associating information relating to the orientation of the mobile device to auditory object information, which describes the pointed or button-click-selected auditory object.
  • the method comprises that said providing the command to the application for executing said function further comprises sending the auditory object information to the application for receiving a feedback relating to said auditory object information, said feedback is provided by headphones or a loudspeaker system, which has a wireless connection to the mobile device.
  • When the command is established based on auditory object information, it is transferred for the use of the application, which synthesises the command into speech or other sounds, such as an earcon, spearcon, auditory icon, or mixed speech, and reproduces it to the user.
  • The selected function can be e.g. writing the letter "A", which is first pointed at in an auditory menu by the mobile station and then selected by a button click.
  • The method which is disclosed in any of the previous embodiments comprises that said feedback through headphones or a loudspeaker system is provided by using three dimensional sound, so that targeted or selected objects are reproduced from their correct directions in a three dimensional audio space.
  • Said spherical auditory menu is an ego-centric auditory menu, which moves along with the user, or an exo-centric menu used with head-tracking.
  • the method comprises receiving, by a device such as a mobile station, a computer, digital video disc (DVD) device, set-top box, video recorder, or sound reproduction equipment, a command relating to an orientation of a mobile device controlling an application carried out by the device for executing a desired function such as the selection of the user interface or game object, the movement of the object, or the copying or pasting of the object etc. in said application, executing said command in said application, and providing a feedback relating to said received command.
  • the application can be e.g. a user interface, voting application, or game.
  • the method which is disclosed in any of the previous embodiments, comprises that said method further comprises establishing a wireless connection with said mobile device, wherein a mobile device user establishes the connection for enabling to carry out a desired function in the application of the device.
  • the method which is disclosed in any of the previous embodiments, further comprises that an application controlled by a mobile device through an air interface is performed in a device.
  • The method further comprises that said providing the feedback relating to the received command further comprises transferring the resulting feedback to the display of the device or of the mobile device, to the headphones of the mobile device user, to a loudspeaker system connected to the device, or to another loudspeaker system within reach of the mobile device user for reproducing said feedback.
  • the method which is disclosed in any of the previous embodiments, comprises that said feedback illustrating a received command is sent through an air interface to said display, headphones or loudspeaker system.
  • Figure 1 illustrates an exemplary view of an arrangement comprising a mobile device and a controlled device according to an advantageous embodiment of the invention
  • Figure 2 illustrates an exemplary view of another arrangement comprising a mobile device and an auditory user interface according to an advantageous embodiment of the invention
  • Figure 3 illustrates an exemplary flowchart of a method according to an advantageous embodiment of the invention
  • Figure 4 illustrates an exemplary flowchart of another method according to an advantageous embodiment of the invention
  • Figure 5 illustrates an exemplary view of a control method of a visualised auditory interface according to an advantageous embodiment of the invention
  • Figure 6 illustrates an exemplary view of orientation information obtaining according to an advantageous embodiment of the invention
  • Figure 7 illustrates an exemplary view of a method for expanding a target area according to an advantageous embodiment of the invention
  • Figure 8 illustrates an exemplary view of another method for expanding a target area according to an advantageous embodiment of the invention
  • Figure 9 illustrates an exemplary view of a third method for expanding a target area according to an advantageous embodiment of the invention
  • Figure 10 illustrates an exemplary view of a mobile device according to an advantageous embodiment of the invention.
  • Figure 11 illustrates an exemplary view of a remote device according to an advantageous embodiment of the invention.
  • Figure 1 illustrates an arrangement 100, wherein a mobile station 110, which comprises three accelerometers for enabling to determine the orientation of the mobile station 110, communicates with a device 120 such as a desktop computer having a display 130, keyboard 140, and loudspeakers 150a, 150b by using a Bluetooth or infrared connection.
  • the computer 120 runs a graphical user interface comprising e.g. a cursor, graphical icons, and tool bar that are displayed on the display 130.
  • If the user, who has the mobile station 110 capable of communicating with the computer 120 wirelessly, wants to open a network browser such as Internet Explorer or Mozilla Firefox for searching for information relating to a Finnish town called Mikkeli, he/she handles the mobile station 110 in his/her hand in the same way as a joystick, tilting it in a certain direction to move the cursor on the display 130 towards the network browser icon and, eventually, onto the browser icon.
  • the user presses the button of the keyboard 160 in the mobile station 110 for opening the network browser.
  • If the loudspeakers 150a, 150b are in an active state, they reproduce a sound indicating the button click.
  • the user moves the cursor by pointing the mobile station 110 to a suitable direction on the address field of the browser window, activates the address field by clicking a mobile station button, and writes a desired address, e.g. www.mikkeli.fi by means of the mobile station 110 for displaying an Internet site providing information relating to Mikkeli.
  • the mobile station 110 comprising accelerometers for enabling to determine the orientation can also be utilised as the remote controller of the television (not shown), which visually illustrates a sphere menu on the display of the television.
  • the mobile station 110 is used for e.g. channel selecting or volume level adjusting.
  • a computer program e.g. a game application, which the computer 120 runs, is displayed on the display 170 of the mobile station and sounds relating to the game application are reproduced through the loudspeaker of the mobile station (not shown) or headphones (not shown) connected to the mobile station 110.
  • the connection between the mobile station 110 and the headphones is either through a wire or wireless.
  • the game application is displayed on the computer display 130 and game sounds are reproduced through the mobile station loudspeaker or headphones connected to the mobile station 110.
  • Figure 2 illustrates another arrangement 200, wherein a mobile station 210 comprises means for determining the orientation of the mobile station, such as three accelerometers implemented in the mobile station 210.
  • The processor (not shown) runs a program application, e.g. an auditory user interface application, in the mobile station 210 for providing the auditory user interface comprising one or more auditory objects 220a-220e to the user.
  • The auditory user interface enables the mobile station 210 to comprise only a single button 230 and possibly a display, but no keyboard.
  • The mobile station 210 provides an auditory space around the user by means of a mobile station loudspeaker (not shown), headphones 240 (or one earphone) having a wired or wireless connection to the mobile station 210, or a loudspeaker system comprising at least one loudspeaker (not shown) around the user.
  • the loudspeaker system is also connected to the mobile station 210 through a wireless or wired connection.
  • When the user wants to select an auditory object 220a illustrating an icon which provides access to a phonebook, he/she points the mobile station 210 in the direction, in this case to the left, where the auditory icon 220a is located. Since the user cannot see the icon 220a, the pointing is based on the user's estimation. Sounds aid in finding the icon and help the user remember its location.
  • If the mobile device 210 comprises the display, the user has better knowledge of the location of the icon.
  • The user interface application provides an audio feedback through the mobile station loudspeaker, headphones 240, one earphone, or a loudspeaker system when the mobile station in the user's hand targets the desired auditory icon 220a.
  • The feedback can be provided by using three dimensional sound, whereupon the feedback comes from the direction in which the targeted icon is located.
  • It is also possible to use mono or stereo sound instead of three dimensional sound in order to provide the feedback to the user.
  • The feedback notifies the user that the phonebook can be accessed by selecting said icon 220a.
  • After the selection, the user can browse the phonebook for selecting the number of the person he/she wants to call, and make a call, by pointing at auditory objects 220a-220e in the auditory menu.
  • The user can also perform e.g. SMS message sending, playlist browsing, playback control, and radio station surfing.
  • Figure 3 discloses, by means of an example only, a flow chart describing a method according to one embodiment of the invention.
  • At start-up, a mobile device and/or an application, such as the user interface executing the method, is turned on, and the necessary stages before connection establishment, such as connection set-up definition and the initialisation of different parameters, are performed.
  • the user defines to which device he/she wants to establish a connection and adjusts different parameters, if needed, for enabling the connection.
  • In a Bluetooth case, this step includes the mobile station sending inquiry requests for finding an available Bluetooth-enabled device, such as a laptop computer, which listens for these inquiry requests and responds if it receives one.
  • In step 304, the wireless connection between the mobile device and the other device is established according to the defined set-up and parameters.
  • The mobile station carries out the connection (page) procedure, while the laptop, which listens for connection requests from the mobile device, is connectable (page scanning). The procedure is targeted, and only the laptop responds to it.
  • In step 306, the other device executes its graphical application.
  • Steps 304 and 306 do not need to be in this order, since the application execution by the other device can be independent from step 304. On the other hand, it is possible that the connection establishment induces the application execution.
  • The orientation information, which comprises e.g. the tilt and angle of the mobile device or a change of the tilt and/or angle, is obtained in step 308.
  • Alternatively, the user can simply push a key or button, without moving the mobile device, in order to select or mark e.g. an object, as step 310 describes.
  • Otherwise, the control method is ended in step 320.
  • When the information relating to the orientation or the selection has been established, the information is associated in step 312 to a command to be executed in the application of the other device.
  • This can mean e.g. that the movement information indicated by Cartesian coordinates is used in the application as such, or is mapped to another coordinate system, such as a rectangular or spherical coordinate system, and associated to the command. It is also possible that the mere coordinates establish the command, no matter which coordinate system they belong to.
  • the command is then transferred to the other device and application through the established Bluetooth connection as step 314 teaches.
  • Alternatively, steps 312 and 314 can be in the opposite order, whereupon the information relating to the orientation or the selection is transferred to the application, where it is mapped and associated to a command to be executed.
  • In step 316, the desired command is executed and its consequences are displayed in the application on the display of the other device, or reproduced through e.g. a loudspeaker or loudspeaker system.
  • In step 318, the user estimates the need for further commands to the application, and if there is such a need, it is possible to return to step 308.
  • If not, the control method is ended in step 320.
  • Figure 4 also illustrates, by means of an example only, a flow chart describing a method according to one embodiment of the invention.
  • Step 402 discloses a method start-up, wherein the mobile device such as a mobile station and its auditory user interface are turned on, and both perform necessary initialisation procedures.
  • the auditory user interface is executed so that it provides an auditory menu comprising auditory objects around the user of the mobile station.
  • The small picture on the left side of figure 5 presents such an auditory menu 510 in a visualised form.
  • the auditory menu comprises several objects on the surface of the spherical auditory menu 520. In the figure, these auditory objects are numbers from zero to nine and a "back" command.
  • After the start-up there can be an auditory notification, which indicates that the user is now in a main menu including main menu auditory objects, which are located in familiar static positions around the user on the sphere surface.
  • a big object 530 on the sphere surface indicates a direction ("sight") where the mobile station has been targeted.
  • the right side of the figure depicts how auditory objects 540a-540f in the three dimensional auditory space surrounding the user can be accessed by pointing or tilting a mobile station 550, when it is in the user's hand 560.
  • the pointing and tilting are carried out by the natural movement of the user's wrist as a joystick.
  • The method continues in step 406 of figure 4, where it is estimated whether the user points in a certain direction with the mobile station, which comprises three accelerometers for determining the orientation, in order to target a desired auditory object.
  • The orientation information, which comprises e.g. the tilt and angle of the mobile device or a change of the tilt and/or angle, is obtained in this step.
  • In step 408, the user can just click a button, touch a touchpad, squeeze the mobile station, or shake the mobile station, without moving it, for selecting an auditory object.
  • When the information relating to the orientation has been established, the information is associated in step 410 to a command to be executed in the auditory user interface.
  • This means that the orientation information indicated by Cartesian coordinates is mapped to a spherical coordinate system and associated to the command. It is also possible that the mere spherical coordinates establish the command. If selection information exists, it is associated to a command.
  • The accelerometers along three axes obtain the coordinates of the pointing direction, which are converted from the obtained accelerometer data affected by gravity and which are depicted with big objects 620a-620c on a sphere surface 610 in the small picture on the left side of the figure.
  • The acceleration data from one axis varies between 1 and -1, being 1 when the axis points downwards, 0 when lateral, and -1 when facing up. If the mobile station 630 is exposed to higher accelerations, the input data is clamped between -1 and 1 to keep the values on the sphere surface.
  • The other two pointing directions are illustrated in the same way as the objects 620b and 620c.
  • In step 412, it is determined whether the command to be executed is to be converted to speech or other sounds, which is the case when the mobile station lacks a display. If the command is displayed on the display of the mobile station, the control method moves directly to step 416.
  • In step 414, the command is converted to an audio form by a synthesizer utilising a text-to-speech algorithm, or by playing recorded samples, so that the pointing of an auditory object causes feedback at a normal speech rate and the selection of an auditory object, for one, causes feedback at a fast speech rate.
  • The menu sounds in a three dimensional auditory display are played one by one, and the user has the ability to jump to listening to the next object, thus stopping the replay of the previous one. All menu objects can be heard one by one with slow gestural motion. If the user jumps to the other side of the menu with a fast gesture, no sound is heard until the gesture stops. When the user stops his/her gesture on a specific object, the sound is repeated at appropriate intervals (this playback logic is sketched in code after this list).
  • Once the command has been converted to speech or other sounds, it is transferred in step 416 to means for reproducing an audio feedback, such as headphones or a loudspeaker system connected wirelessly to the mobile station.
  • In step 418, the speech indicating the pointed or selected auditory menu object is reproduced from the correct direction with three dimensional audio.
  • the feedback to the user can also be provided by using a mono or stereo sound instead of the three dimensional sound.
  • The user estimates in step 420 the need for further control of the auditory user interface, and if there is such a need, one can return to step 406.
  • the above described control method utilises the auditory user interface.
  • The auditory menu contains e.g. a dial menu, which enables dialling, listening to, and removing selected numbers,
  • an SMS menu, which contains all letters and a "space" for writing, and an object for sending a written message, and
  • a phonebook, which is organised alphabetically.
  • The phonebook can include multiple layers, e.g. letters of the alphabet and, for each of them, a submenu with names.
  • The menu browsing happens by moving the "virtual sight", as mentioned earlier, amongst the menu objects, which are spread evenly in the surrounding circular space. Depending on the number of objects, the space between them is dynamically changed.
  • Each object can be accessed directly by pointing or tilting towards its known location, and the auditory feedback of the targeted objects reveals whether the right object was pointed at; if not, the static order of the objects reveals where the desired object will be found, since object locations are easy to remember with longer use of the menu.
  • The auditory menu objects are separate objects that monitor whether the "sight" is in their defined area.
  • Targeted objects send information to the other objects in order to manage the positions of the other objects, fade their sounds, and adjust their target sizes.
  • the auditory spherical menu also comprises a dynamically adjusted target sector, where objects are audible.
  • Menu objects 710a-710c have an even distribution when the objects are not active, e.g. when the mobile station is pointed upwards or moved fast.
  • When targeted, the target area of the object 720b expands in both directions, reaching a 1.9 times larger target area; accordingly, the target areas of the other objects 720a, 720c shrink (see the sketch after this list).
  • The dynamic zoom of the target area reduces undesired jumping between objects and facilitates object selection with a bigger number of objects.
  • The bottom of the figure shows one example of how the positions of the numbers 730 and letters 740 can be implemented around the user in the static auditory menu.
  • Figure 9 discloses a complex three dimensional structure, wherein the menu objects are positioned on several elevated horizontal levels 920, 930, enabling simultaneous access to the object groups.
  • The special menu objects 940 of the auditory dial menu are grouped on the upper part of the sphere 910, and the numbers are located on the horizontal plane 930.
  • Figure 10 discloses one example of a mobile device 1000 adapted to control an application.
  • The mobile device comprises a processor 1010 for performing instructions and handling data, a memory unit 1020 for storing data such as instructions and application data, a user interface 1030, which can be e.g. a single button, a keyboard, a touchpad, or other selection means, data transfer means 1040 for transmitting and receiving data, means 1050 for determining the orientation of the mobile device, such as accelerometers, and a loudspeaker 1070.
  • The mobile device can also comprise a display 1060 for providing a graphical or haptic feedback, but this is not necessary.
  • Memory 1020 stores at least an auditory user interface application 1022, an accelerometer application 1024, and a synthesizer application 1026.
  • the implemented accelerometers obtain acceleration data, which the processor 1010 manipulates according to the instructions of the corresponding application 1024, and the synthesizer 1026 converts obtained commands from text format to speech.
  • Figure 11 discloses a device, which is controlled through an air interface by a mobile device.
  • Such a device can be e.g. a mobile station, computer, laptop, DVD recorder, etc.
  • the device 1100 comprises a processor 1110 for performing instructions and handling data, a memory 1120 for storing data such as instructions and application data, a user interface 1130 comprising e.g. touchpad, keyboard, or one or more buttons, at least data receiving means 1140 for receiving data, and a display 1150.
  • The device 1100 can comprise data transmitting means 1142 for sending data to a loudspeaker (not shown), but this is not necessary.
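Two of the mechanisms listed above lend themselves to short code sketches. Both are illustrative only: the patent publishes no code, and the names, thresholds, and audio hooks (`play`, `stop`) used below are assumptions, not features of the disclosure. The first sketch covers the gesture-speed-dependent playback of step 414: fast gestures silence the menu, slow motion plays the objects one by one, and holding the "sight" on an object repeats its sound at intervals.

```python
# Sketch of gesture-speed-dependent menu playback (assumed names/values).
import time

FAST_GESTURE = 2.0      # rad/s; assumed threshold for a "fast" gesture
REPEAT_INTERVAL = 1.5   # s; assumed repeat interval on a held object

class MenuAudio:
    def __init__(self, play, stop):
        self.play, self.stop = play, stop   # audio back-end hooks (assumed)
        self.current = None
        self.last_played = 0.0

    def update(self, targeted_object, angular_speed):
        now = time.monotonic()
        if angular_speed > FAST_GESTURE:
            # Fast jump across the menu: no sound until the gesture stops.
            self.stop()
            self.current = None
        elif targeted_object is not self.current:
            # Slow motion: jumping to the next object stops the previous one.
            self.stop()
            if targeted_object is not None:
                self.play(targeted_object)
            self.current, self.last_played = targeted_object, now
        elif targeted_object is not None and now - self.last_played > REPEAT_INTERVAL:
            # Holding the "sight" on one object: repeat at appropriate intervals.
            self.play(targeted_object)
            self.last_played = now
```

The second sketch covers the dynamic target sectors of figures 7 and 8: menu objects are spread evenly in the surrounding circular space, and the targeted object's sector expands to 1.9 times its even share while the others shrink to compensate.

```python
import math
from typing import List, Optional

def sector_widths(n_objects: int, active_index: Optional[int],
                  zoom: float = 1.9) -> List[float]:
    """Angular width (radians) of each object's target sector on the
    circle; an even distribution when nothing is targeted."""
    even = 2 * math.pi / n_objects
    if active_index is None or n_objects < 2:
        return [even] * n_objects
    widths = [even] * n_objects
    widths[active_index] = even * zoom                 # expanded target area
    shrink = even * (zoom - 1.0) / (n_objects - 1)     # shared by the rest
    for i in range(n_objects):
        if i != active_index:
            widths[i] -= shrink
    return widths
```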

Abstract

The invention relates to a method, which comprises obtaining (308) information relating to an orientation of a mobile device, associating (312) said information relating to the orientation of the mobile device to a command for executing a function in an application, and providing (314) said command to the application for executing said function in said application. The invention also relates to a mobile device and computer program performed in the mobile device, which execute a function in an application.

Description

METHOD AND DEVICE FOR CONTROLLING AN APPLICATION
TECHNICAL FIELD
The invention relates to a method and mobile device for executing a function in an application. The invention also relates to a method and device for executing an application controlled by a mobile device.
BACKGROUND
Nowadays, people's mobility seems to increase constantly, as does their need for information acquisition and game playing.
Previously, the majority of Internet browsing and game playing was performed at home, in a library, an Internet cafe, or a workplace, in front of a table carrying a personal computer with a network connection and game applications. Modern people's opportunity and compulsion to spend their free time in one place, however, keep diminishing, while the need for information acquisition and game playing remains outside the home. Consequently, the problem of carrying a device capable of Internet browsing and game playing arises.
Mobile devices, and especially mobile stations and personal digital assistants (PDAs), have offered a good foundation for developing devices which can be carried all the time, which are available outside the home, and which are capable of Internet browsing or game playing irrespective of the user's location, in order to solve the aforesaid problem. In addition, mobile stations and PDAs are widely used and are carried by people almost constantly due to their versatile character. So, mobile devices play an important role in people's everyday life.
Present mobile devices have user interfaces for inputting data and instructions, such as a keyboard, function buttons, a tracking ball, a small "joystick" in the middle of the keyboard, and a touchpad, but their usability for device or application control is limited, since their use requires both of the user's hands while an application is controlled. Thus, it is impossible to take notes during application control. In addition to the two-handed usage, the user's stance can be quite uncomfortable if he/she is unable to reach a static stance, e.g. by sitting in a chair. Overall, application control through the above-mentioned user interfaces of the mobile device is very troublesome if, for example, the user has to move or travels on a congested bus, subway, or train during rush hour.
SUMMARY
One object of the invention is to provide a method for executing a function in an application, a mobile device able to execute a function in an application, a method for executing an application controlled by a mobile device, and a device able to execute an application controlled by a mobile device.
The object of the invention is fulfilled by providing a method, wherein information relating to an orientation of a mobile device is obtained, said information relating to the orientation of the mobile device is associated to a command for executing a function in an application, and said command is provided to the application for executing said function in said application.
The object of the invention is also fulfilled by providing a mobile device, which comprises means for obtaining information relating to an orientation of said device, means for associating said information relating to the orientation of the mobile device to a command for executing a function in an application, and means for providing said command to the application for executing said function in said application.
The object of the invention is also fulfilled by providing a method, wherein a command relating to an orientation of a mobile device controlling an application for executing a function in said application is received, said command is executed in said application, and a feedback relating to said received command is provided.
The object of the invention is also fulfilled by providing a device, which comprises means for receiving a command relating to an orientation of a mobile device controlling an application for executing a function in said application, means for executing said command in said application, and means for providing a feedback relating to said received command.
According to an embodiment of the invention an auditory user interface is controlled by a user who points and tilts a mobile station in his/her hand in different directions for browsing a spherical ego-centric auditory menu comprising one or more auditory objects. Targeted auditory menu objects are indicated with speech or other sounds and reproduced from corresponding directions with three dimensional audio. The synthesised targeted auditory menu objects are transmitted from the mobile station, which possibly does not comprise a display at all, to the user's headphones so that a three dimensional auditory space is established around the user.
An embodiment of the present invention relates to a method according to independent claim 1.
In addition, an embodiment of the present invention relates to a mobile device according to independent claim 17.
Furthermore, an embodiment of the present invention relates to a method according to independent claim 32.
Also, an embodiment of the present invention relates to a device according to independent claim 37.
Further embodiments are defined in dependent claims.
According to an embodiment of the invention a method comprises obtaining information relating to an orientation of a mobile device capable of transmitting and receiving information, e.g. a mobile station or personal digital assistant (PDA), associating said information relating to the orientation of the mobile device to a command, which executes a function such as the selection of the user interface or game object, the movement of the object, or the copying or pasting of the object etc. in an application, and providing said command to the application for executing said function in said application. The application can be e.g. the user interface, voting application, or game of the mobile device or of another remote device such as a mobile station, computer, digital video disc (DVD) device, set-top box, video recorder, or sound reproduction equipment.
The term "orientation of a mobile device" refers situation wherein a mobile device is in user's hand and it is tilted or pointed to a certain direction by means of the motion of the user's wrist, whereupon it is achieved a certain orientation to said mobile device.
According to an embodiment of the invention the method, which is disclosed in the previous embodiment, further comprises establishing a connection to said application, wherein the user of the mobile device establishes the connection to the application for enabling a desired function to be carried out. According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said connection to the application is a wireless connection, such as a radio frequency connection (e.g. Bluetooth, Wibree, and WiFi) or an infrared connection.
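As an illustration of such a connection establishment, the sketch below opens a Bluetooth RFCOMM link from the mobile device to the device running the application. It is a minimal sketch, assuming CPython on Linux with Bluetooth socket support; the device address, channel, and command bytes are placeholders, since the patent does not specify a wire format.

```python
import socket

def connect_to_application(bdaddr: str = "00:11:22:33:44:55",
                           channel: int = 1) -> socket.socket:
    """Open the RFCOMM connection over which commands are sent."""
    sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                         socket.BTPROTO_RFCOMM)
    sock.connect((bdaddr, channel))   # (address, channel), not (host, port)
    return sock

# sock = connect_to_application()
# sock.send(b"SELECT")  # a deduced command, in whatever format the devices agree on
```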
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said obtaining information relating to the orientation of the mobile device comprises determining information relating to a tilt angle and tilt direction of the mobile device, when it is in the user's hand and is pointed in a certain direction or rotated by the natural wrist movement of the user (as with a traditional joystick), from accelerometers implemented in the device. The method uses accelerometers along three axes in order to sense the orientation of the mobile device relative to the direction of the earth's gravity. So, a mapping from Cartesian axes (x, y, z) to spherical polar coordinates (azimuth and elevation) provides an easy way to determine a position on the surface of the sphere, and this is utilised in the context of user interfaces comprising a spherical menu. At present, it is usual that sophisticated mobile devices are equipped with three accelerometers.
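A minimal sketch of this mapping, with assumed axis conventions (the patent fixes none): each axis reading is clamped to [-1, 1], as described later in connection with figure 6, the resulting gravity vector is normalised onto the unit sphere, and the Cartesian components are converted to azimuth and elevation.

```python
import math
from typing import Tuple

def clamp(v: float) -> float:
    """One axis reading in g units: 1 pointing down, 0 lateral, -1 up."""
    return max(-1.0, min(1.0, v))

def to_azimuth_elevation(ax: float, ay: float, az: float) -> Tuple[float, float]:
    """Map a 3-axis accelerometer reading to spherical polar coordinates
    (azimuth, elevation) in radians; the axis assignment is an assumption."""
    x, y, z = clamp(ax), clamp(ay), clamp(az)
    norm = math.sqrt(x * x + y * y + z * z) or 1.0   # guard against all-zero input
    x, y, z = x / norm, y / norm, z / norm           # point on the sphere surface
    azimuth = math.atan2(y, x)
    elevation = math.atan2(z, math.hypot(x, y))
    return azimuth, elevation
```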
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said obtaining information relating to the orientation of the mobile device further comprises determining whether a key or button of the mobile device (which, at minimum, comprises a single button (key) and no keyboard or display) has been pressed for selecting the function to be executed. Naturally, the determination also relates to double-clicking, touchpad pushing, mobile device squeezing, mobile device shaking, etc.
Due to the above, the term "button" also refers to a touchpad, device squeezing, and device shaking, which, like the button, can provide information relating to a function selection.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said associating information relating to the orientation of the mobile device to the command for executing the function comprises deducing the command, which is directed to a controlled application, for executing the desired function on the grounds of the tilt angle and tilt direction information relating to the orientation of the mobile device in the user's hand and the button pressing information, which can comprise an indication relating to at least one realised button click or an interfered button click. According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said providing the command to the application for executing said function comprises sending the command from a mobile device unit processing the association to said application, so that the application is able to carry out the desired operation according to the transferred command.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said method further comprises receiving an audio feedback, a visual feedback, or a haptic feedback relating to the sent command from said application.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said application is provided in another device, e.g. a mobile station, a computer, digital video disc device, set-top box, video recorder, or sound reproduction equipment, and said connection establishment to the application and said command transfer to the application are provided through an air interface between the mobile device and said another device by means of e.g. radio frequency or infrared.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said another device is a device for playing a game, wherein a feedback is provided visually and/or audibly. The visual feedback is provided on either the display of the game device or the display of the mobile device.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said application, which receives a command produced by the movement (and possibly one or more button clicks) of the mobile device, is provided in said mobile device.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said application is a three dimensional auditory interface comprising a spherical auditory menu, which comprises one or more auditory objects in an auditory space. The spherical auditory menu can also be presented visually on the display of the mobile device for illustrating the auditory objects on the surface of the spherical menu. The auditory objects (sounds) can also be located elsewhere than on the sphere surface, e.g. at different distances from the user. By the way, the geometry of the visual menu can be e.g. a cylinder or a cube instead of said sphere. Therefore, it is easy for a user to follow the browsing process of the menu, since the visual spherical menu comprises selectable (auditory) objects and a cursor representing the movement of the mobile station in the user's hand. These auditory objects enable basic use of a mobile station, e.g. dialling, sending short message service (SMS) messages, phonebook browsing, playlist browsing, playback control, and radio station surfing. Visually and/or audibly, the auditory objects comprise letters, numbers, commands enabling the execution of function(s), links, (sub)menus, etc.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said associating information relating to the orientation of the mobile device to the command to be transferred to an application comprising auditory objects on at least a spherical auditory menu for executing the function comprises associating information relating to the orientation of the mobile device to auditory object information, which describes the pointed or button-click-selected auditory object.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said providing the command to the application for executing said function further comprises sending the auditory object information to the application for receiving a feedback relating to said auditory object information, said feedback being provided by headphones or a loudspeaker system, which has a wireless connection to the mobile device. So, when the command is established based on auditory object information, it is transferred for the use of the application, which synthesises the command into speech or other sounds, such as an earcon, spearcon, auditory icon, or mixed speech, and reproduces it to the user. Secondly, it is possible to play recorded samples instead of said synthesis. The selected function can be e.g. writing the letter "A", which is first pointed at in an auditory menu by the mobile station and then selected by a button click.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said feedback through headphones or a loudspeaker system is provided by using three dimensional sound, so that targeted or selected objects are reproduced from their correct directions in a three dimensional audio space. Naturally, it is possible to use mono or stereo sound instead of the three dimensional sound in order to provide the feedback to the user. According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said spherical auditory menu is an ego-centric auditory menu, which moves along with the user, or an exo-centric menu used with head-tracking.
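As a stand-in for full three dimensional reproduction, a constant-power stereo pan already gives the directional cue described above; a real implementation would use e.g. HRTF-based rendering. The sketch below is an assumption (azimuth 0 straight ahead, positive to the right), not the patent's method.

```python
import math
from typing import Tuple

def pan_gains(azimuth: float) -> Tuple[float, float]:
    """(left, right) gains for a sound at `azimuth` radians; directions
    behind the listener are folded into the frontal arc."""
    a = max(-math.pi / 2, min(math.pi / 2, azimuth))  # clamp to frontal arc
    theta = (a + math.pi / 2) / 2                     # 0..pi/2 across the stereo field
    return math.cos(theta), math.sin(theta)           # constant power: L^2 + R^2 = 1
```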
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises receiving, by a device such as a mobile station, a computer, digital video disc (DVD) device, set-top box, video recorder, or sound reproduction equipment, a command relating to an orientation of a mobile device controlling an application carried out by the device for executing a desired function such as the selection of the user interface or game object, the movement of the object, or the copying or pasting of the object etc. in said application, executing said command in said application, and providing a feedback relating to said received command. The application can be e.g. a user interface, voting application, or game. According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said method further comprises establishing a wireless connection with said mobile device, wherein a mobile device user establishes the connection for enabling to carry out a desired function in the application of the device. According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises that an application controlled by a mobile device through an air interface is performed in a device.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, further comprises that said providing the feedback relating to the received command further comprises transferring the resulting feedback to the display of the device or of the mobile device, to the headphones of the mobile device user, to a loudspeaker system connected to the device, or to another loudspeaker system within reach of the mobile device user for reproducing said feedback.
According to an embodiment of the invention the method, which is disclosed in any of the previous embodiments, comprises that said feedback illustrating a received command is sent through an air interface to said display, headphones or loudspeaker system.
BRIEF DESCRIPTION OF THE DRAWINGS
Next, the aspects of the invention will be described in greater detail with reference to exemplary embodiments in accordance with the accompanying drawings, of which
Figure 1 illustrates an exemplary view of an arrangement comprising a mobile device and a controlled device according to an advantageous embodiment of the invention,
Figure 2 illustrates an exemplary view of another arrangement comprising a mobile device and an auditory user interface according to an advantageous embodiment of the invention,
Figure 3 illustrates an exemplary flowchart of a method according to an advantageous embodiment of the invention,
Figure 4 illustrates an exemplary flowchart of another method according to an advantageous embodiment of the invention,
Figure 5 illustrates an exemplary view of a control method of a visualised auditory interface according to an advantageous embodiment of the invention,
Figure 6 illustrates an exemplary view of obtaining orientation information according to an advantageous embodiment of the invention,
Figure 7 illustrates an exemplary view of a method for expanding a target area according to an advantageous embodiment of the invention,
Figure 8 illustrates an exemplary view of another method for expanding a target area according to an advantageous embodiment of the invention,
Figure 9 illustrates an exemplary view of a third method for expanding a target area according to an advantageous embodiment of the invention,
Figure 10 illustrates an exemplary view of a mobile device according to an advantageous embodiment of the invention, and
Figure 11 illustrates an exemplary view of a remote device according to an advantageous embodiment of the invention.

DETAILED DESCRIPTION
Figure 1 illustrates an arrangement 100, wherein a mobile station 110, which comprises three accelerometers for determining the orientation of the mobile station 110, communicates with a device 120, such as a desktop computer having a display 130, a keyboard 140, and loudspeakers 150a, 150b, by using a Bluetooth or infrared connection. The computer 120 runs a graphical user interface comprising e.g. a cursor, graphical icons, and a toolbar that are displayed on the display 130.
If the user, who has the mobile station 110 capable of communicating with the computer 120 wirelessly, wants to open a network browser such as Internet Explorer or Mozilla Firefox for searching for information relating to a Finnish town called Mikkeli, he/she handles the mobile station 110 in his/her hand in the same way as a joystick, tilting it in a certain direction for moving the cursor on the display 130 towards the network browser icon and, eventually, onto the browser icon. When the cursor is on the browser icon, the user presses a button of the keyboard 160 in the mobile station 110 for opening the network browser. Then, if the loudspeakers 150a, 150b are in an active state, they reproduce a sound indicating the button click. Correspondingly, when the browser window has been opened, the user moves the cursor onto the address field of the browser window by pointing the mobile station 110 in a suitable direction, activates the address field by clicking a mobile station button, and writes a desired address, e.g. www.mikkeli.fi, by means of the mobile station 110 for displaying an Internet site providing information relating to Mikkeli.
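As a rough illustration of this joystick-style pointing, the following Python sketch maps tilt readings to a cursor displacement. The function name, dead zone, and sensitivity constant are our own assumptions added for illustration; the embodiment itself does not prescribe any particular mapping.

    def tilt_to_cursor_delta(tilt_x, tilt_y, sensitivity=20.0, dead_zone=0.05):
        # Hypothetical mapping: normalised tilt in [-1, 1] per axis becomes
        # a per-update cursor displacement in pixels. A small dead zone
        # around the neutral pose filters out hand tremor.
        dx = 0.0 if abs(tilt_x) < dead_zone else tilt_x * sensitivity
        dy = 0.0 if abs(tilt_y) < dead_zone else tilt_y * sensitivity
        return dx, dy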
The mobile station 110, comprising accelerometers for determining the orientation, can also be utilised as the remote controller of a television (not shown), which visually illustrates a sphere menu on the display of the television.
The mobile station 110 is used for e.g. channel selecting or volume level adjusting.
On the other hand, it is possible that a computer program, e.g. a game application, which the computer 120 runs, is displayed on the display 170 of the mobile station, and that sounds relating to the game application are reproduced through the loudspeaker of the mobile station (not shown) or headphones (not shown) connected to the mobile station 110. The connection between the mobile station 110 and the headphones is either wired or wireless. Moreover, it is possible that the game application is displayed on the computer display 130 and the game sounds are reproduced through the mobile station loudspeaker or headphones connected to the mobile station 110.
Figure 2 illustrates another arrangement 200, wherein a mobile station 210 comprises means for determining the orientation of the mobile station, such as three accelerometers implemented in the mobile station 210. The processor (not shown) runs a program application, e.g. an auditory user interface application, in the mobile station 210 for providing the auditory user interface, comprising one or more auditory objects 220a-220e, to the user. The auditory user interface enables the mobile station 210 to comprise only a single button 230 and possibly a display, but no keyboard. The mobile station 210 provides an auditory space around the user by means of a mobile station loudspeaker (not shown), headphones 240 (or one earphone) having a wired or wireless connection to the mobile station 210, or a loudspeaker system comprising at least one loudspeaker (not shown) around the user. The loudspeaker system is also connected to the mobile station 210 through a wireless or wired connection.
If the user, who has the mobile station 210 in his/her hand, wants to select an auditory object 220a illustrating an icon, which provides access to a phonebook, he/she points the mobile station 210 in the direction, in this case to the left, where the auditory icon 220a is located. Since the user cannot see the icon 220a, the pointing is based on the user's estimation. Sounds aid in finding the icon and help the user remember its location. On the other hand, if the mobile device 210 comprises the display, he/she has better knowledge of the location of the icon.
The user interface application provides audio feedback through the mobile station loudspeaker, the headphones 240, one earphone, or the loudspeaker system when the mobile station in the user's hand is targeted at the desired auditory icon 220a. The feedback can also be provided by using three dimensional sound, whereupon the feedback comes from the direction in which the targeted icon is located. Of course, it is possible to use mono or stereo sound instead of the three dimensional sound in order to provide the feedback to the user. The feedback notifies the user that it is possible to access the phonebook by selecting said icon 220a. Then, the user clicks the button 230 or a suitable key on the keyboard of the mobile station (not shown), or makes the selection in some other way and by other means, for accessing the phonebook, and the application provides further audio feedback, which indicates that the selection has been made, by means of a fast replay. After that the user can browse the phonebook for selecting the number of the person whom he/she wants to call, and make the call by pointing at auditory objects 220a-220e in the auditory menu.
In the above-described way, the user can also perform e.g. SMS message sending, playlist browsing, playback control, and radio station surfing.
Figure 3 discloses, by means of an example only, a flow chart describing a method according to one embodiment of the invention.
During the method start-up in step 302, a mobile device and/or an application, such as the user interface executing the method, is turned on, and the stages necessary before connection establishment, such as connection set-up definition and the initialisation of different parameters, are carried out.
In this case, the user defines to which device he/she wants to establish a connection and adjusts different parameters, if needed, for enabling the connection. For example, in the case of a Bluetooth connection between the mobile station and another device, this step includes sending inquiry requests from the mobile station for finding an available Bluetooth-enabled device, such as a laptop computer, which listens for these inquiry requests and responds if it receives one.
Next, in step 304, the wireless connection between the mobile device and the other device is established according to the defined set-up and parameters. In the Bluetooth connection establishment, the mobile station carries out the connection (page) procedure while the laptop, which listens for connection requests from the mobile device, is connectable (page scanning). The procedure is targeted, and only the laptop responds to the connection procedure.
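The inquiry and page sequence above can be sketched, for example, with the PyBluez Python library. This is a minimal illustration under our own assumptions (the target device name and the RFCOMM port are hypothetical); the embodiment does not mandate any particular Bluetooth stack or API.

    import bluetooth  # PyBluez

    def connect_to_listener(target_name="laptop", port=1):
        # Inquiry: the mobile side scans for discoverable devices and
        # resolves their human-readable names.
        nearby = bluetooth.discover_devices(duration=8, lookup_names=True)
        for addr, name in nearby:
            if name == target_name:
                # Page: open a targeted RFCOMM channel to the one device
                # that should respond to the connection procedure.
                sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
                sock.connect((addr, port))
                return sock
        raise RuntimeError("no matching Bluetooth device found")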
In step 306, the other device executes its graphical application.
Steps 304 and 306 do not need to be in this order, since the application execution by the other device can be independent of step 304. On the other hand, it is possible that the connection establishment induces the application execution.
If a user points in a certain direction with the mobile device, which comprises means for determining the orientation, in order to move an object such as a cursor on the display of the other device, the orientation information, which comprises e.g. the tilt and angle of the mobile device or the change of the tilt and/or angle, is obtained in step 308. Optionally, the user can simply push a key or button without moving the mobile device in order to select or mark e.g. an object, as step 310 describes.
In the case where both the movement of the mobile device and the selection are absent, the control method is ended in step 320. However, if the information relating to the orientation or the selection is established, the information is associated in step 312 with a command to be executed in the application of the other device. This can mean e.g. that the movement information indicated by Cartesian coordinates is used in the application as such, or mapped to another coordinate system, such as a rectangular or spherical coordinate system, and associated with the command. It is also possible that the mere coordinates establish the command, no matter which coordinate system they belong to.
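As one concrete reading of this mapping step, the following Python sketch converts a Cartesian pointing vector to spherical coordinates (azimuth and elevation) before a command is derived from them. The function and variable names are illustrative assumptions, not terms used by the embodiment.

    import math

    def cartesian_to_spherical(x, y, z):
        # Azimuth in the horizontal plane and elevation above it, both
        # in radians, for the pointing vector (x, y, z).
        azimuth = math.atan2(y, x)
        r = math.sqrt(x * x + y * y + z * z)
        elevation = math.asin(z / r) if r > 0.0 else 0.0
        return azimuth, elevation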
The command is then transferred to the other device and its application through the established Bluetooth connection, as step 314 teaches.
Steps 312 and 314 can also be performed in the opposite order, whereupon the information relating to the orientation or the selection is transferred to the application, where it is mapped and associated with a command to be executed.
In step 316 the desired command is executed, and its consequences are displayed in the application on the display of the other device or reproduced through e.g. a loudspeaker or loudspeaker system.
In step 318 the user estimates the need for further commands to the application, and if there is such a need, it is possible to return to step 308.
Otherwise, when the application controlling is successfully completed, the control method is ended in step 320.
Figure 4 also illustrates, by means of an example only, a flow chart describing a method according to one embodiment of the invention.
Step 402 discloses a method start-up, wherein the mobile device such as a mobile station and its auditory user interface are turned on, and both perform necessary initialisation procedures.
Then, in step 404, the auditory user interface is executed so that it provides an auditory menu comprising auditory objects around the user of the mobile station. The small picture on the left side of figure 5 presents such an auditory menu 510 in a visualised form. The auditory menu comprises several objects on the surface of the spherical auditory menu 520. In the figure, these auditory objects are the numbers from zero to nine and a "back" command. After the start-up there can be an auditory notification, which indicates that the user is now in a main menu including main menu auditory objects, which are located in familiar static positions around the user on the sphere surface. A big object 530 on the sphere surface indicates the direction ("sight") at which the mobile station has been targeted. The right side of the figure depicts how auditory objects 540a-540f in the three dimensional auditory space surrounding the user can be accessed by pointing or tilting a mobile station 550 when it is in the user's hand 560. The pointing and tilting are carried out by the natural movement of the user's wrist, as with a joystick.
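A minimal sketch of resolving the "sight" to a menu object is given below: the targeted object is taken to be the one whose direction on the sphere is angularly closest to the pointing direction. This is a deliberately simplified, centralised alternative to the per-object sector monitoring described later, and all names are our own assumptions.

    import math

    def angular_distance(a, b):
        # Great-circle angle (radians) between two unit vectors a and b.
        dot = sum(p * q for p, q in zip(a, b))
        return math.acos(max(-1.0, min(1.0, dot)))

    def targeted_object(sight, object_directions):
        # Index of the menu object closest to the sight direction.
        return min(range(len(object_directions)),
                   key=lambda i: angular_distance(sight, object_directions[i]))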
Next, back to step 406 in figure 4, where it is estimated whether a user points in a certain direction with the mobile station, which comprises three accelerometers for determining the orientation, in order to target a desired auditory object. The orientation information, which comprises e.g. the tilt and angle of the mobile device or the change of the tilt and/or angle, is obtained in this step.
Alternatively, during step 408, the user can just click a button, touch a touchpad, squeeze the mobile station, or shake the mobile station, without moving it, for selecting an auditory object.
If both the orientation and the selection information are absent, the control method is ended in step 422.
If, however, the information relating to the orientation is established, the information is associated in step 410 with a command to be executed in the auditory user interface. This means that the orientation information indicated by Cartesian coordinates is mapped to a spherical coordinate system and associated with the command. It is also possible that the mere spherical coordinates establish the command. If the selection information exists, it is associated with a command.
Figure 6 describes one example of how the orientation information is obtained. The accelerometers along three axes (x, y, z) obtain the coordinates of the pointing direction, which are converted from the obtained accelerometer data affected by gravity and which are depicted by the big objects 620a-620c on a sphere surface 610 in the small picture on the left side of the figure. The acceleration data from one axis varies between 1 and -1, being 1 when the axis points downwards, 0 when lateral, and -1 when facing up. If the mobile station 630 is exposed to higher accelerations, the input data is clamped between -1 and 1 to keep the values on the sphere surface.
As one can see from the bigger picture, if the mobile station 630 in a user's hand 640 is tilted to the left, it provides the data values x=-1, y=0, and z=0, whereupon the pointing direction can be illustrated on the visualised sphere surface 610 as the object 620a. The other two pointing directions are illustrated in the same way as the objects 620b and 620c.
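The clamping described above can be sketched as follows; this is only an illustration with assumed names, in which each gravity-affected axis reading is limited to [-1, 1] so that the derived pointing direction stays on the sphere surface even under strong hand accelerations.

    def pointing_direction(ax, ay, az):
        # Clamp each raw axis reading to [-1, 1].
        clamp = lambda v: max(-1.0, min(1.0, v))
        x, y, z = clamp(ax), clamp(ay), clamp(az)
        # Example from the text: tilting left yields (x, y, z) = (-1, 0, 0).
        return x, y, z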
In step 412 it is determined whether the command to be executed is converted to speech or other sounds, in the case where the mobile station lacks a display. If the command is displayed on the display of the mobile station, then the control method moves directly to step 416.
During step 414 the command is converted to an audio form by a synthesizer utilising a text-to-speech algorithm, or by playing recorded samples, so that the pointing of the auditory object causes feedback with a normal speech speed and the selection of the auditory object, for one, causes feedback with a fast speech speed. This differentiates the pointing and the selection for the user without a visual indication in the eyes-free auditory interface. The menu sounds in a three dimensional auditory display are played one by one, and the user has the ability to jump to listen to the next object, thus stopping the replay of the previous one. All menu objects can be heard one by one with a slow gestural motion. If the user jumps to the other side of the menu with a fast gesture, no sound is heard until the gesture stops. When the user stops his/her gesture on a specific object, the sound is repeated at appropriate intervals.
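The normal-versus-fast speech distinction could look like the following Python sketch, written here against the pyttsx3 text-to-speech library purely as an example; the library choice and the concrete rate values are our assumptions, since the embodiment only requires that a selection is replayed faster than a pointing.

    import pyttsx3

    def speak_feedback(label, selected=False):
        engine = pyttsx3.init()
        # Faster replay signals a confirmed selection; normal speed
        # signals mere pointing in the eyes-free interface.
        engine.setProperty("rate", 300 if selected else 150)
        engine.say(label)
        engine.runAndWait()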
Moreover, it is possible to produce the feedback both audibly and visually to the user so that the visual user interface supports the interpretation of the auditory user interface.
Once the command has been converted to speech or other sounds, it is transferred in step 416 to means for reproducing audio feedback, such as headphones or a loudspeaker system connected wirelessly to the mobile station.
In step 418 the speech indicating the pointed or selected auditory menu object is reproduced from the correct direction with three dimensional audio. The feedback to the user can also be provided by using mono or stereo sound instead of the three dimensional sound. The user estimates in step 420 the need for further controlling of the auditory user interface, and if there is such a need, one can return to step 406.
If the user interface controlling is successfully completed, the control method is ended in step 422.
The above-described control method utilises the auditory user interface. Said user interface contains the auditory menu, which includes e.g. a dial menu, which enables dialling, listening to, and removing selected numbers; an SMS menu, which contains all letters and a "space" for writing and an object for sending a written message; and a phonebook menu, which comprises an alphabetically organised phonebook. The phonebook can include multiple layers, e.g. the letters of the alphabet and, for each of them, a submenu with names.
The menu browsing happens by moving the "virtual sight", as mentioned earlier, amongst the menu objects, which are spread evenly in the surrounding circular space. Depending on the number of the objects, the spacing between them is dynamically changed.
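For instance, an even placement with dynamically recomputed spacing could be sketched as below; the function name is an assumption made for illustration.

    def menu_azimuths(n_objects):
        # One azimuth (degrees) per object, spread evenly around the
        # user; the spacing adapts automatically to the object count.
        spacing = 360.0 / n_objects
        return [i * spacing for i in range(n_objects)]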
The static placement of the menu objects suits three dimensional menus well, since the objects are always at known locations. Thus, each object can be accessed directly by pointing or tilting towards its known location, and the auditory feedback of the targeted object reveals whether the right object was pointed at; if not, the static order of the objects reveals where the desired object will be found, since object locations are easy to remember with longer use of the menu.
The auditory menu objects are separate objects that monitor whether the "sight" is in their defined area. Targeted objects send information to the other objects in order to manage the positions of the other objects, fade their sounds, and adjust their target sizes.
In order to enhance the pointing accuracy, the auditory spherical menu also comprises a dynamically adjusted target sector, in which objects are audible. As illustrated in figure 7, menu objects 710a-710c have an even distribution when the objects are not active, e.g. when the mobile station is pointed upwards or moved fast. When a menu object 720b is pointed at, the target area of the object 720b expands in both directions, reaching a 1.9 times larger target area. Accordingly, the target areas of the other objects 720a, 720c shrink. The dynamic zoom of the target area reduces undesired jumping between objects and facilitates the object selection with a bigger number of objects. At the bottom of the figure is one example of how the positions of the numbers 730 and letters 740 can be implemented around the user in the static auditory menu.
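A minimal sketch of this dynamic zoom is given below, assuming sector widths on a circle; the 1.9 factor comes from the text, while everything else (the names and the uniform shrinking of the non-targeted sectors) is our own illustrative assumption.

    def sector_widths(n_objects, targeted_index=None, zoom=1.9):
        # Idle state: every object gets an equal sector of the circle.
        base = 360.0 / n_objects
        if targeted_index is None or n_objects < 2:
            return [base] * n_objects
        # Targeted state: the pointed sector grows by the zoom factor
        # and the remaining sectors shrink to keep the circle covered.
        expanded = base * zoom
        shrunk = (360.0 - expanded) / (n_objects - 1)
        return [expanded if i == targeted_index else shrunk
                for i in range(n_objects)]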
On the other hand, it is possible to spread the objects close to a targeted object with an even spacing of 40 degrees (just an example; it could also be e.g. 20 or 50 degrees) and to place objects further away in groups, as one can see from figure 8. The left side of the figure represents evenly spread auditory objects, and the right side, for one, an increased spacing in the vicinity of the targeted object with grouped objects further away. When the objects are browsed by a rotating gesture, the next object is always 40 degrees forward and the previous one 40 degrees backwards, respectively.
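Under the same 40-degree example, the re-spacing around the targeted object could be sketched as follows; the dictionary layout and all names are hypothetical.

    def respace_near_target(target_azimuth, step=40.0):
        # Place the immediate neighbours one fixed step on either side
        # of the target, so one browsing gesture always moves exactly
        # one object; remaining objects would be grouped further away.
        return {
            "previous": (target_azimuth - step) % 360.0,
            "target": target_azimuth % 360.0,
            "next": (target_azimuth + step) % 360.0,
        }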
Figure 9 discloses a complex three dimensional structure, wherein the menu objects are positioned on several elevated horizontal levels 920, 930, enabling simultaneous access to the object groups. The special menu objects 940 of the auditory dial menu are grouped on the upper part of the sphere 910, and the numbers are located on the horizontal plane 930.
Figure 10 discloses one example of a mobile device 1000 adapted to control an application. The mobile device comprises a processor 1010 for performing instructions and handling data, a memory unit 1020 for storing data such as instructions and application data, a user interface 1030, which can be e.g. a single button, a keyboard, a touchpad, or other selection means, data transfer means 1040 for transmitting and receiving data, means 1050 for determining the orientation of the mobile device, such as accelerometers, and a loudspeaker 1070. The mobile device can also comprise a display 1060 for providing graphical or haptic feedback, though this is not necessary.
The memory 1020 stores at least an auditory user interface application 1022, an accelerometer application 1024, and a synthesizer application 1026. The implemented accelerometers obtain acceleration data, which the processor 1010 manipulates according to the instructions of the corresponding application 1024, and the synthesizer 1026 converts obtained commands from text format to speech.
Figure 11, for one, discloses a device which is controlled through an air interface by a mobile device. Such a device 1100 can be e.g. a mobile station, a computer, a laptop, a DVD recorder, etc. The device 1100 comprises a processor 1110 for performing instructions and handling data, a memory 1120 for storing data such as instructions and application data, a user interface 1130 comprising e.g. a touchpad, a keyboard, or one or more buttons, at least data receiving means 1140 for receiving data, and a display 1150. Furthermore, the device 1100 can comprise data transmitting means 1142 for sending data to a loudspeaker (not shown), though this is not necessary.
The invention has now been explained above with reference to the aforesaid embodiments, and the several advantages of the invention have been demonstrated. It is clear that the invention is not restricted only to these embodiments, but comprises all possible embodiments within the spirit and scope of the inventive thought and the following patent claims.

Claims

1. A method comprising
obtaining information relating to an orientation of a mobile device,
associating said information relating to the orientation of the mobile device to a command for executing a function in an application, and
providing said command to the application for executing said function in said application.
2. The method according to claim 1, wherein said method further comprises
establishing a connection to said application.
3. The method according to claim 1 or 2, wherein said connection to the application is a wireless connection.
4. The method according to any of claims 1-3, wherein said obtaining information relating to the orientation of the mobile device comprises determining information relating to a tilt angle and tilt direction of the mobile device received from accelerometers implemented into the device.
5. The method according to any of claims 1-4, wherein said obtaining information relating to the orientation of the mobile device further comprises determining whether a button of the device has been pressed for selecting the function to be executed.
6. The method according to any of claims 1-5, wherein said associating information relating to the orientation of the mobile device to the command for executing the function comprises deducing the command for executing the function on the grounds of the tilt angle and tilt direction of the mobile device and the button pressing information.
7. The method according to any of claims 1-6, wherein said providing the command to the application for executing said function comprises
sending the command to said application.
8. The method according to any of claims 1-7, wherein said method further comprises
receiving a feedback relating to the sent command from said application.
9. The method according to any of claims 1-8, wherein said application is provided in another device and said establishing the connection to the application and said sending the command to the application is provided through an air interface between the mobile device and said another device.
10. The method according to any of claims 1-9, wherein said another device is a device for playing a game.
11. The method according to any of claims 1, 2, and 4-8, wherein said application is provided in said mobile device.
12. The method according to any of claims 1, 2, and 4-9, wherein said application is a three dimensional auditory interface comprising a spherical auditory menu, which comprises one or more auditory objects in an auditory space.
13. The method according to any of claims 1, 2, 4-9, and 12, wherein said associating information relating to the orientation of the mobile device to the command for executing the function comprises
associating information relating to the orientation of the mobile device to auditory object information.
14. The method according to any of claims 1, 2, 4-9, 12, and 13, wherein said providing the command to the application for executing said function further comprises
sending the auditory object information to the application for receiving a feedback relating to said auditory object information, wherein said feedback is provided by headphones or a loudspeaker system, which has a wireless connection to the mobile device.
15. The method according to any of claims 1, 2, 4-9, 12-13, and 14, wherein said feedback is provided by using a three dimensional sound.
16. The method according to any of claims 1, 2, 4-9, 12-14, and 15, wherein said spherical auditory menu is an ego-centric auditory menu.
17. A mobile device comprising
means for obtaining information relating to an orientation of said mobile device, means for associating said information relating to the orientation of the mobile device to a command for executing a function in an application, and
means for providing said command to the application for executing said function in said application.
18. The mobile device according to claim 17, wherein said mobile device further comprises means for establishing a connection to said application.
19. The mobile device according to claim 17 or 18, wherein said connection to the application is a wireless connection.
20. The mobile device according to any of claims 17-19, wherein said means for obtaining information relating to the orientation of the mobile device comprises
means for determining information relating to a tilt angle and tilt direction of the mobile device received from accelerometers implemented into the mobile device.
21. The mobile device according to any of claims 17-20, wherein said means for obtaining information relating to the orientation of the mobile device further comprises means for determining whether a button of the mobile device has been pressed for selecting the function to be executed.
22. The mobile device according to any of claims 17-21, wherein said means for associating information relating to the orientation of the mobile device to the command for executing the function comprises
means for deducing the command for executing the function on the grounds of the tilt angle and tilt direction of the mobile device and the button pressing information.
23. The mobile device according to any of claims 17-22, wherein said means for providing the command to the application for executing said function comprises
means for sending the command to said application.
24. The mobile device according to any of claims 17-23, wherein said mobile device further comprises
means for receiving a feedback relating to the sent command from said application.
25. The mobile device according to any of claims 17-24, wherein said connection establishment to the application and said sending of the command to the application is provided through an air interface between the mobile device and said another device comprising said application.
26. The mobile device according to any of claims 17, 18, and 20-24, wherein said application is provided in said mobile device.
27. The mobile device according to any of claims 17, 18, and 20-24, wherein said application is a three dimensional auditory interface comprising a spherical auditory menu, which comprises one or more auditory objects in an auditory space.
28. The mobile device according to any of claims 17, 18, 20-24, and 27, wherein said means for associating information relating to the orientation of the mobile device to the command for executing the function comprises means for associating information relating to the orientation of the mobile device to auditory object information.
29. The mobile device according to any of claims 17, 18, 20-24, 27, and 28, wherein said means for providing the command to the application for executing said function further comprises
means for sending the auditory object information to the application for receiving a feedback relating to said auditory object information, wherein said feedback is provided by headphones or a loudspeaker system, which has a wireless connection to the mobile device.
30. The mobile device according to any of claims 17, 18, 20-24, 27, 28, and 29, wherein said feedback is provided by using a three dimensional sound.
31. The mobile device according to any of claims 17, 18, 20-24, 27-29, and 30, wherein said spherical auditory menu is an ego-centric auditory menu.
32. A method comprising
receiving a command relating to an orientation of a mobile device controlling an application for executing a function in said application,
executing said command in said application, and
providing a feedback relating to said received command.
33. The method according to claim 32, wherein said method further comprises
establishing a wireless connection with said mobile device.
34. The method according to claim 32 or 33, wherein said method further comprises
carrying out said application controlled by said mobile device.
35. The method according to any of claims 32-34, wherein said providing the feedback relating to the received command further comprises
sending the feedback to a display or headphones or a loudspeaker system for reproducing said feedback.
36. The method according to any of claims 32-35, wherein said feedback is sent through an air interface to said display or headphones or loudspeaker system.
37. A device comprising
means for receiving a command relating to an orientation of a mobile device controlling an application for executing a function in said application,
means for executing said command in said application, and
means for providing a feedback relating to said received command.
38. The device according to claim 37, wherein said device further comprises
means for establishing a wireless connection with said mobile device.
39. The device according to claim 37 or 38, wherein said device further comprises
means for carrying out said application controlled by said mobile device.
40. The device according to any of claims 37-39, wherein said providing the feedback relating to the received command further comprises
means for sending the feedback to a display or headphones or a loudspeaker system for reproducing said feedback.
41. The device according to any of claims 37-40, wherein said feedback is sent through an air interface to said display or headphones or loudspeaker system.
42. A computer program product comprising code means configured to execute the method according to any of claims 1-16 or 32-36 when the program is run on a computer.
43. A carrier medium comprising a computer program product according to claim 42.
PCT/FI2009/050856 2008-10-24 2009-10-26 Method and device for controlling an application WO2010046541A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20080591 2008-10-24
FI20080591A FI20080591A0 (en) 2008-10-24 2008-10-24 Gesture-driven interface

Publications (1)

Publication Number Publication Date
WO2010046541A1 true WO2010046541A1 (en) 2010-04-29

Family

ID=39924562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2009/050856 WO2010046541A1 (en) 2008-10-24 2009-10-26 Method and device for controlling an application

Country Status (2)

Country Link
FI (1) FI20080591A0 (en)
WO (1) WO2010046541A1 (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0447775A (en) * 1990-06-14 1992-02-17 Matsushita Electric Ind Co Ltd Image pickup device
EP0825514A2 (en) * 1996-08-05 1998-02-25 Sony Corporation Information processing device and method for inputting information by operating the overall device with a hand
JPH10200618A (en) * 1997-01-13 1998-07-31 Kazunori Tsukiki Input device
US6160540A (en) * 1998-01-12 2000-12-12 Xerox Company Zoomorphic computer user interface
WO2001027735A1 (en) * 1999-10-12 2001-04-19 Myorigo Oy Operation method of user interface of hand-held device
GB2358108A (en) * 1999-11-29 2001-07-11 Nokia Mobile Phones Ltd Controlling a hand-held communication device
WO2002068201A2 (en) * 2001-02-28 2002-09-06 Beckmann Juergen Input device, especially for a mobile telephone
DE10233608A1 (en) * 2002-07-24 2004-02-12 Siemens Ag Input device for a terminal
US20050212767A1 (en) * 2004-03-23 2005-09-29 Marvit David L Context dependent gesture response
WO2006090197A1 (en) * 2005-02-24 2006-08-31 Nokia Corporation Motion-input device for a computing terminal and method of its operation
WO2007060287A1 (en) * 2005-11-28 2007-05-31 Innohome Oy Remote control system
US20070297625A1 (en) * 2006-06-22 2007-12-27 Sony Ericsson Mobile Communications Ab Wireless communications devices with three dimensional audio systems

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DICKE C. ET AL: "Experiments in Mobile Spatial Audio-Conferencing: Key-based and Gesture-based Interaction", PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE ON HUMAN-COMPUTER INTERACTION WITH MOBILE DEVICES AND SERVICES (MOBILEHCI 2008), AMSTERDAM, THE NETHERLANDS, 2 - 5 SEPTEMBER, 2008. ACM INTERNATIONAL CONFERENCE PROCEEDING SERIES, NEW YORK: ACM, 2008, 2 September 2008 (2008-09-02) - 5 September 2008 (2008-09-05), AMSTERDAM, THE NETHERLANDS, pages 91 - 100 *
HARMA A. ET AL: "Augmented Reality Audio for Mobile and Wearable Appliances", J. AUDIO ENG. SOC., vol. 52, no. 6, June 2004 (2004-06-01), pages 618 - 639 *
MARENTAKIS G. N. ET AL: "Effects of Feedback, Mobility and Index of Difficulty on Deictic Spatial Audio Target Acquisition in the Horizontal Plane", PROCEEDINGS OF THE SIGCHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI 2006), MONTREAL, CANADA, 22 - 27 APRIL, 2006. NEW YORK: ACM, 2006, 22 April 2006 (2006-04-22) - 27 April 2006 (2006-04-27), pages 359 - 368 *
PATENT ABSTRACTS OF JAPAN *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014065595A1 (en) * 2012-10-23 2014-05-01 엘지전자 주식회사 Image display device and method for controlling same
US9600077B2 (en) 2012-10-23 2017-03-21 Lg Electronics Inc. Image display device and method for controlling same

Also Published As

Publication number Publication date
FI20080591A0 (en) 2008-10-24

Similar Documents

Publication Publication Date Title
JP6012636B2 (en) Terminal operation method according to integrated input and portable terminal supporting the same
EP2487575B1 (en) Method and apparatus for area-efficient graphical user interface
AU2012200532B2 (en) Method and apparatus for graphical user interface
US9335790B2 (en) Wearable devices and associated systems
JP5793426B2 (en) System and method for interpreting physical interaction with a graphical user interface
US20130342456A1 (en) Remote control apparatus and control method thereof
TWI613582B (en) Method for reconfiguring user interface objects,touch-sensitive electronic device and non-transitorycomputer-readable storage medium
US20140189506A1 (en) Systems And Methods For Interpreting Physical Interactions With A Graphical User Interface
EP2733628A2 (en) Screen display method and a mobile terminal
US20110087983A1 (en) Mobile communication terminal having touch interface and touch interface method
CN112905071A (en) Multi-function device control for another electronic device
CN104765584A (en) User terminal apparatus and control method thereof
TW201633101A (en) User interface for receiving user input
US10031581B1 (en) Virtual artifacts using mobile devices
KR20100078295A (en) Apparatus and method for controlling operation of portable terminal using different touch zone
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
CN108829325A (en) For dynamically adjusting the equipment, method and graphic user interface of the presentation of audio output
KR20100097376A (en) Apparatus and method for controlling operation of portable terminal using different touch zone
WO2010116028A2 (en) Method for controlling an apparatus
KR20160057740A (en) Display apparatus and control method thereof
JP2011023040A (en) Input device of content providing device
JP2014002719A (en) Remote control device, display device and method for controlling the same
WO2010046541A1 (en) Method and device for controlling an application
JP6569546B2 (en) Display device, display control method, and display control program
KR102117450B1 (en) Display device and method for controlling thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09821654

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09821654

Country of ref document: EP

Kind code of ref document: A1