US20070115265A1 - Mobile device and method - Google Patents

Mobile device and method

Info

Publication number
US20070115265A1
Authority
US
United States
Prior art keywords
touch
type
spatial
implement
user interface
Prior art date
2005-11-21
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/284,695
Inventor
Roope Rainisto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2005-11-21
Filing date
2005-11-21
Publication date
2007-05-24
Application filed by Nokia Oyj
Priority to US11/284,695
Assigned to NOKIA CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RAINISTO, ROOPE
Priority to JP2008540713A
Priority to EP06809169A
Priority to PCT/IB2006/003084
Priority to KR1020087009103A
Publication of US20070115265A1
Legal status: Abandoned (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

A method of controlling a mobile communication terminal comprises the steps of sensing (201) a touch on a touch sensitive display, determining (203) a type of implement having provided the sensed touch on the touch sensitive display, where the type of implement is one of at least a blunt type and a pointed type. Depending on the determined type of implement, user interface elements of a first spatial configuration are displayed (207) when the determined type of implement is the pointed type and user interface elements of a second spatial configuration are displayed (209) when the determined type of implement is the blunt type.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a method for controlling a mobile communication terminal, a mobile communication terminal and a computer program performing such a method. Specifically, the invention relates to facilitating user input using a touch sensitive display.
  • BACKGROUND
  • Present-day mobile devices such as mobile phones are often equipped with display screens that are combined with a transparent touch sensitive layer. Such an arrangement, typically referred to as a touch sensitive display, is configured to receive input from a user through a user interface, either by use of a dedicated pointer device (often referred to as a stylus) or simply by the user tapping the screen with a finger tip.
  • Needless to say, a stylus and a finger are quite different pointer devices. The tip of a stylus is smaller and lighter, allowing more precise input than a human finger. The finger is larger and heavier and does not allow very precise input, at least in terms of spatial resolution. On the other hand, the finger is always immediately available, whereas the stylus typically must be extracted from a storage arrangement within or attached to the mobile device and, after use, replaced there.
  • Although it is possible to design and realize a user interface that is suited for either the stylus or the finger, a problem arises due to their incompatibility. That is, the use of a mobile device, such as a cellular telephone, involves a number of different short-term and longer-term tasks. Some tasks require only one or two actions, i.e. “taps” on the touch sensitive display, by the user, and some tasks require several minutes and dozens of “taps” or “clicks”. Hence, any prior art user interface that is suited to accommodate use by either the stylus or the finger is necessarily a compromise in this regard. This is particularly accentuated for small mobile devices having very small display screens, where a compromise is unavoidable regarding the size and the number of displayed user interface elements. Furthermore, requiring the user to “take out the stylus” to provide input via the user interface in order to have the device perform a specific function is typically also a major burden, both in the sense that it is time consuming and in that it is often quite impractical for the user.
  • When designing mobile devices that support an “always-on”, instant-use mode, designing for finger input instead of stylus use is a good principle. On the other hand, the additional precision of stylus use should nevertheless be supported in order to provide the desired flexibility from the viewpoint of the user.
  • A way to bridge the gap between stylus and finger user interface functionality is hence desirable, so that one single user interface suits both types of use properly. Attempts to bridge this gap have been made with compromise user interface designs that, e.g., support stylus input while providing separate hardware keys that allow selection of user interface elements without tapping the screen, with designs for finger input (the Myorigo device, for example), or by allowing the user to scale and zoom the user interface elements as desired.
  • SUMMARY OF THE INVENTION
  • An object of the invention is to overcome at least some of the drawbacks relating to the compromise designs of prior art devices as discussed above.
  • Hence, in a first aspect there is provided a method for controlling a mobile communication terminal comprising a touch sensitive display. The method comprises the steps of sensing a touch on the touch sensitive display, determining a type of implement having provided the sensed touch on the touch sensitive display, where the type of implement is one of at least a blunt type and a pointed type. Depending on the determined type of implement, user interface elements of a first spatial configuration are displayed when the determined type of implement is the pointed type and user interface elements of a second spatial configuration are displayed when the determined type of implement is the blunt type.
  • The first and second spatial configurations may correspond to a respective first and second spatial scale, wherein the first spatial scale is smaller than the second spatial scale. The first and second spatial configurations may also correspond to a respective first and second spatial distribution of user interface elements. The first and second spatial distributions may also comprise a respective first and second number of elements, wherein the first number of elements is larger than the second number of elements.
  • The sensing of a touch may involve providing touch information in the form of at least mechanical pressure information and/or at least electric resistance information. The touch sensing may also involve providing touch information regarding the spatial distribution of the touch.
  • Hence, the word “touch” is intended to encompass the general concept of being able to determine whether the input is made with a pointed, stylus-type implement or a more blunt implement, such as a human finger; the way of sensing the touch information may differ with the technical implementation. Pressure information, electric resistance, and the spatial distribution, e.g. the size, of the implement used to touch the display may be used individually or in combination to determine the “touch”. An example of how to combine pressure information and spatial distribution is to multiply the sensed pressure by the area over which the pressure is sensed.
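To make that combination concrete, the following is a minimal sketch of such an implement-type determination. The field names, sensor units and the calibration threshold are illustrative assumptions, not part of the disclosure; an actual implementation would depend on the particular sensing circuitry.

```python
from dataclasses import dataclass
from enum import Enum

class ImplementType(Enum):
    POINTED = "pointed"  # e.g. a stylus tip
    BLUNT = "blunt"      # e.g. a finger tip

@dataclass
class TouchSample:
    pressure: float      # sensed mechanical pressure (arbitrary units)
    contact_area: float  # spatial distribution of the touch, e.g. in mm^2
    resistance: float    # sensed electric resistance (arbitrary units)

# Assumed calibration constant: a finger spreads the touch over a much
# larger area than a stylus, so the pressure-area product is taken here
# to exceed the threshold for blunt implements.
BLUNT_THRESHOLD = 50.0

def determine_implement(sample: TouchSample) -> ImplementType:
    """Combine pressure and spatial distribution by multiplying the sensed
    pressure with the area over which it is sensed, as described above."""
    if sample.pressure * sample.contact_area >= BLUNT_THRESHOLD:
        return ImplementType.BLUNT
    return ImplementType.POINTED
```

Under these assumed values, `determine_implement(TouchSample(pressure=2.0, contact_area=60.0, resistance=1.0))` would classify the touch as blunt, since the pressure-area product (120.0) exceeds the threshold.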
  • In other words, the control circuitry of the terminal is configured (i.e. programmed using software components) such that it generates information about a touch on the touch sensitive display in the form of the type of implement used, indicating whether the tap was made with a pointed implement, such as a stylus, or with a blunt implement, such as a finger tip. Typically, during touch sensing, the circuitry will also sense at which position on the display the touch was made. Such information, although typically very useful, is not essential for the invention at hand.
  • After the sensing of a touch, it is determined that one action is to be performed when the tapping is sensed to have been performed with a pointed implement, such as a stylus, and another action when a blunt implement, such as a finger tip, has been used. The action (view, dialog etc.) in the user interface that is performed when a stylus tap has been determined is designed for stylus use, and the action (view, dialog etc.) that is performed when a finger-tip tap has been determined is designed for finger-tip use. For example, the configuration of the user interface elements may change in terms of different spatial scales and a different number of displayed elements. The elements may vary in size and their locations may vary. Moreover, a plurality of elements may be grouped together and configured such that, e.g. in a case with input keys, one single displayed key is associated with the group of keys.
  • In summary, a user interface style is achieved that provides the user interface with flexibility based on whether the user is currently tapping the screen with a pointed implement, such as a stylus, or a blunt implement, such as a finger, without requiring any separate user setting or mode switching between stylus and finger user interface modes. Hence, information regarding the manner in which the display has been touched is utilized, and user interface functionality is provided that supports both stylus and finger use without a need to specify separate modes of operation in one and the same device. This is advantageous in a number of ways: it is usable in a wide range of user interface situations; it is totally modeless, i.e. there is no need for the user to switch between stylus and finger modes; and it is totally transparent, i.e. there is no need to provide an on-screen or hardware control to switch between modes. The invention also makes the terminal stylus-independent, in that there is no need for a dedicated stylus with a certain mechanical system to distinguish between stylus and finger use (some existing styluses, for instance, use a magnetic/electrical element in the tip that the display circuitry detects and interacts with).
  • In other aspects, the invention provides a system and a computer program having features and advantages corresponding to those discussed above.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
  • FIG. 1 shows schematically a block diagram of a communication terminal according to one embodiment of the present invention.
  • FIG. 2 is a flow chart illustrating a number of steps of a method according to one embodiment of the present invention.
  • FIGS. 3 a-c illustrate the appearance of user interface elements on a display of a terminal during operation of the method of FIG. 2.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present inventions now will be described more fully hereinafter with reference to the accompanying drawings, in which some examples of the embodiments of the inventions are shown. Indeed, these inventions may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
  • FIG. 1 illustrates schematically a communication terminal 101 in which an embodiment of the present invention is implemented. The terminal 101 is capable of communication via an air interface 103 with a radio communication system 105 such as the well known systems GSM/GPRS, UMTS, CDMA 2000, etc. The terminal comprises a processor 107, memory 109 as well as input/output units in the form of a microphone 111, a speaker 113, a touch sensitive display 115 and a keyboard 117. The touch sensitive display 115 comprises appropriate touch sensing means, such as electronic sensing circuitry 116, configured to sense touch by way of, e.g., a pointed stylus as well as a finger tip. The circuitry 116 may be configured to sense variations in any one or more of mechanical pressure, electric resistance and spatial distribution of the touch. In this regard, actuation of a touch sensitive display 115 with a pointed implement generally provides more mechanical pressure, less electrical resistance and less spatial distribution than actuation by a blunt implement under the same actuation conditions. Radio communication is realized by radio circuitry 119 and an antenna 121. The details regarding how these units communicate are known to the skilled person and are therefore not discussed further.
  • The communication terminal 101 may, for example, be a mobile telephone terminal or a PDA equipped with radio communication means. The method according to the present invention will in general reside in the form of software instructions, together with other software components necessary for the operation of the terminal 101, in the memory 109 of the terminal. Any type of conventional removable memory is possible, such as a diskette, a hard drive, a semi-permanent storage chip such as a flash memory card or “memory stick”, etc. The software instructions implementing the inventive function may be provided to the memory 109 in a number of ways, including distribution via the network 105 from a software supplier 123. That is, the program code of the invention may also be considered as a form of transmitted signal, such as a stream of data communicated via the Internet or any other type of communication network, including cellular radio communication networks of any kind, such as GSM/GPRS, UMTS, CDMA 2000, etc.
  • Turning now to FIGS. 2 and 3a-c, a method according to one embodiment of the invention will be described in terms of a number of steps to be taken by the controlling software in a terminal such as the terminal 101 described above in connection with FIG. 1.
  • The exemplifying method starts at a point in time when a user interface element in the form of an input text field 305 is displayed on a touch sensitive display 303 of a terminal 301. As the skilled person will realize, any amount of displayed information may also be present on the display 303 as indicated by schematically illustrated dummy content 307.
  • A touch action, e.g. tapping, performed by a user on the input text field 305 is sensed in a sensing step 201. The sensing is realized, as discussed above, by a touch sensing means, such as sensing circuitry connected to the display 303 (cf. sensing circuitry 116 in FIG. 1).
  • In a determination step 203, a type of implement used by the user when performing the sensed touch is determined. Here, two types of implements are distinguished: a pointed implement, such as a stylus, and a more blunt implement, such as a finger tip. As used herein, a pointed implement need not necessarily include a distal end that is perfectly pointed, and the blunt implement need not include a distal end that is completely blunt. Instead, the pointed implement is merely more pointed than the blunt implement, and the blunt implement is more blunt than the pointed implement. The determination of the type of implement is typically performed by determining means that is generally implemented by computer instructions stored in a memory device, such as memory 109, and executed by processor 107.
  • In a selection step 205, the determined type of implement is used to select between two alternatives for presenting subsequent user interface elements on the display 303. Like the determining means, the selection of the manner of presentation of the user interface elements is typically performed by control means that is generally implemented by computer instructions stored in a memory device, such as memory 109, and executed by processor 107.
  • In a case where the type of implement is determined to be a pointed implement, such as a stylus, a user interface having elements of a spatially small scale is displayed in a display step 207. This is illustrated in FIG. 3b, where user interface elements in the form of a keyboard 309 are displayed having a small spatial scale and comprising a large number of individual user interface elements (i.e. keypad keys). A text output field 311 is also indicated, in which any subsequent user input (i.e. results of tapping on the displayed keyboard 309) is to be displayed during a continuation as indicated by reference numeral 211.
  • In a case where the type of implement is determined in the determination step 203 to be a blunt implement, such as a finger tip, a user interface having elements of a spatially large scale is displayed in a display step 209. This is illustrated in FIG. 3c, where user interface elements in the form of a keyboard 313 are displayed having a large spatial scale and comprising a smaller number of individual user interface elements (i.e. keypad keys) in comparison with the small scale user interface. As used herein, large and small spatial scales are relative terms, the large spatial scale merely being larger than the small spatial scale. A text output field 315 is also indicated, in which any subsequent user input (i.e. results of tapping on the displayed keyboard 313) is to be displayed during a continuation as indicated by reference numeral 211′.
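As a minimal sketch of this control flow (steps 201 through 209), reusing the TouchSample, ImplementType and determine_implement names from the sketch above; the layout contents and the display helper are likewise illustrative assumptions:

```python
# Hypothetical layout descriptions: a dense stylus keyboard (step 207)
# versus a sparse finger keyboard (step 209).
SMALL_SCALE_KEYBOARD = {"keys": 40, "key_size_mm": 4}   # cf. keyboard 309
LARGE_SCALE_KEYBOARD = {"keys": 12, "key_size_mm": 10}  # cf. keyboard 313

def on_touch(sample: TouchSample) -> None:
    implement = determine_implement(sample)     # determination step 203
    if implement is ImplementType.POINTED:      # selection step 205
        display_keyboard(SMALL_SCALE_KEYBOARD)  # display step 207
    else:
        display_keyboard(LARGE_SCALE_KEYBOARD)  # display step 209

def display_keyboard(layout: dict) -> None:
    # Stand-in for the terminal's UI toolkit; a real device would render
    # the keyboard on the touch sensitive display 303.
    print(f"Displaying {layout['keys']} keys at {layout['key_size_mm']} mm")
```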
  • Although the example above only shows user interface elements in the form of keyboard keys having different spatial scales and different locations on the display 303, other elements are also possible, such as user interface elements in the forms of scroll bars, editing windows, dialog boxes etc. Moreover, a plurality of elements may be grouped together and configured such that, e.g. in a case with input keys, one single displayed key is associated with the group of keys.
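The key-grouping idea mentioned above can be sketched as follows; the particular groups are hypothetical, chosen to resemble a 12-key phone keypad:

```python
# One displayed key on the large-scale (finger) layout stands for a whole
# group of keys from the small-scale (stylus) layout.
KEY_GROUPS = {
    "abc": ["a", "b", "c"],
    "def": ["d", "e", "f"],
    "ghi": ["g", "h", "i"],
}

def keys_behind(displayed_key: str) -> list[str]:
    # A subsequent disambiguation step (e.g. multi-tap or prediction)
    # would pick the intended character from the group.
    return KEY_GROUPS.get(displayed_key, [displayed_key])
```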
  • In addition to or instead of displaying the user interface elements at larger and smaller scales in response to detecting actuation by blunt and pointed implements, respectively, the user interface can display the user interface elements in accordance with various other spatial configurations depending upon the type of implement: spatial configurations that require more precise input are provided in response to the detection of a pointed implement, and spatial configurations that have greater tolerance in terms of the acceptable input are provided in response to the detection of a blunt implement. For example, the user interface can display user interface elements in accordance with different spatial distributions: when a pointed implement is detected, the spatial distribution is smaller, such that the user interface elements are positioned more closely to neighboring user interface elements; when a blunt implement is detected, the spatial distribution is greater, such that the user interface elements are more widely spaced apart from one another.
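A sketch of such a spacing adjustment, again reusing ImplementType from the earlier sketch; the millimetre values are assumptions:

```python
def element_spacing_mm(implement: ImplementType) -> float:
    # A pointed implement tolerates tightly packed elements; a blunt
    # implement needs elements spaced more widely apart.
    return 1.0 if implement is ImplementType.POINTED else 4.0

def element_positions(count: int, width_mm: float,
                      implement: ImplementType) -> list[float]:
    # Left edges of `count` elements of the given width, laid out in a
    # row with implement-dependent spacing between neighbors.
    pitch = width_mm + element_spacing_mm(implement)
    return [i * pitch for i in range(count)]
```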
  • Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific examples of the embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (19)

1. A method for controlling a mobile communication terminal comprising a touch sensitive display, the method comprising the steps of:
sensing a touch on the touch sensitive display,
determining a type of implement having provided the sensed touch on the touch sensitive display, said type of implement being one of at least a blunt type and a pointed type, and
depending on the determined type of implement, displaying user interface elements of a first spatial configuration when the determined type of implement is the pointed type and displaying user interface elements of a second spatial configuration when the determined type of implement is the blunt type.
2. The method according to claim 1, wherein the first and second spatial configurations correspond to a respective first and second spatial scale and wherein the first spatial scale is smaller than the second spatial scale.
3. The method according to claim 1, wherein the first and second spatial configurations correspond to a respective first and second spatial distribution of user interface elements.
4. The method according to claim 1, wherein the first and second spatial distribution comprises a respective first and second number of elements and wherein the first number of elements is larger than the second number of elements.
5. The method according to claim 1, wherein the step of sensing a touch involves providing touch information in the form of at least mechanical pressure information.
6. The method according to claim 1, wherein the step of sensing a touch involves providing touch information in the form of at least electric resistance information.
7. The method according to claim 5, wherein the step of sensing a touch involves providing touch information comprising information regarding spatial distribution of the touch information.
8. A mobile communication terminal comprising a touch sensitive display and:
touch sensing means for sensing a touch on the touch sensitive display,
determining means for determining a type of implement having provided the sensed touch on the touch sensitive display, said type of implement being one of at least a blunt type and a pointed type, and
control means configured for, depending on the determined type of implement, displaying user interface elements of a first spatial configuration when the determined type of implement is the pointed type and displaying user interface elements of a second spatial configuration when the determined type of implement is the blunt type.
9. The terminal according to claim 8, wherein the first and second spatial configurations correspond to a respective first and second spatial scale and wherein the first spatial scale is smaller than the second spatial scale.
10. The terminal according to claim 8, wherein the first and second spatial configurations correspond to a respective first and second spatial distribution of user interface elements.
11. The terminal according to claim 8, wherein the first and second spatial distribution comprises a respective first and second number of elements and wherein the first number of elements is larger than the second number of elements.
12. The terminal according to claim 8, wherein the touch sensing means comprises means for providing touch information in the form of at least mechanical pressure information.
13. The terminal according to claim 8, wherein the touch sensing means comprises means for providing touch information in the form of at least electric resistance information.
14. The terminal according to claim 12, wherein the touch sensing means comprises means for providing touch information comprising information regarding spatial distribution of the touch information.
15. A computer program product comprising a computer readable medium having computer readable software instructions embodied therein, wherein the computer readable software instructions comprise:
computer readable software instructions capable of sensing a touch on the touch sensitive display,
computer readable software instructions capable of determining a type of implement having provided the sensed touch on the touch sensitive display, said type of implement being one of at least a blunt type and a pointed type, and
computer readable software instructions capable of, depending on the determined type of implement, displaying user interface elements of a first spatial configuration when the determined type of implement is the pointed type and displaying user interface elements of a second spatial configuration when the determined type of implement is the blunt type.
16. The computer program product according to claim 15, wherein the first and second spatial configurations correspond to a respective first and second spatial scale and wherein the first spatial scale is smaller than the second spatial scale.
17. The computer program product according to claim 15, wherein the first and second spatial configurations correspond to a respective first and second spatial distribution of user interface elements.
18. The computer program product according to claim 15, wherein the first and second spatial distribution comprises a respective first and second number of elements and wherein the first number of elements is larger than the second number of elements.
19. The computer program product according to claim 15, wherein the computer readable software instructions that are capable of sensing a touch are further capable of providing touch information in the form of at least one of mechanical pressure information, electric resistance information and spatial distribution of touch information.
US11/284,695 2005-11-21 2005-11-21 Mobile device and method Abandoned US20070115265A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US11/284,695 US20070115265A1 (en) 2005-11-21 2005-11-21 Mobile device and method
JP2008540713A JP2009516284A (en) 2005-11-21 2006-10-23 Improved mobile device and method
EP06809169A EP1952223A1 (en) 2005-11-21 2006-10-23 Improved mobile device and method
PCT/IB2006/003084 WO2007057736A1 (en) 2005-11-21 2006-10-23 Improved mobile device and method
KR1020087009103A KR20080057287A (en) 2005-11-21 2006-10-23 Improved mobile device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/284,695 US20070115265A1 (en) 2005-11-21 2005-11-21 Mobile device and method

Publications (1)

Publication Number Publication Date
US20070115265A1 true US20070115265A1 (en) 2007-05-24

Family

ID=38048327

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/284,695 Abandoned US20070115265A1 (en) 2005-11-21 2005-11-21 Mobile device and method

Country Status (5)

Country Link
US (1) US20070115265A1 (en)
EP (1) EP1952223A1 (en)
JP (1) JP2009516284A (en)
KR (1) KR20080057287A (en)
WO (1) WO2007057736A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090006958A1 (en) * 2007-06-29 2009-01-01 Nokia Corporation Method, Apparatus and Computer Program Product for Providing an Object Selection Mechanism for Display Devices
EP2085865A1 (en) 2008-01-30 2009-08-05 Research In Motion Limited Electronic device and method of controlling the same
JP5227715B2 (en) * 2008-09-26 2013-07-03 Necパーソナルコンピュータ株式会社 Portable terminal device, information processing apparatus, and program
JP5212640B2 (en) * 2008-12-18 2013-06-19 シャープ株式会社 Interface device and GUI configuration method


Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02299013A (en) * 1989-05-15 1990-12-11 Kyocera Corp Electronic system notebook device
JP2599019B2 (en) * 1990-06-28 1997-04-09 三洋電機株式会社 Pen input device
JPH0468392A (en) * 1990-07-09 1992-03-04 Toshiba Corp Image display device
JPH04127315A (en) * 1990-09-19 1992-04-28 Fujitsu Ltd Personal computer
JPH09231006A (en) * 1996-02-28 1997-09-05 Nec Home Electron Ltd Portable information processor
JPH11110111A (en) * 1997-09-29 1999-04-23 Pfu Ltd Touch panel supporting device
WO1999028811A1 (en) * 1997-12-04 1999-06-10 Northern Telecom Limited Contextual gesture interface
JP2001222378A (en) * 2000-02-10 2001-08-17 Nec Saitama Ltd Touch panel input device
JP2005182487A (en) * 2003-12-19 2005-07-07 Nec Software Chubu Ltd Character input apparatus, method and program

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5956020A (en) * 1995-07-27 1999-09-21 Microtouch Systems, Inc. Touchscreen controller with pen and/or finger inputs
US6611258B1 (en) * 1996-01-11 2003-08-26 Canon Kabushiki Kaisha Information processing apparatus and its method
US6211858B1 (en) * 1997-09-26 2001-04-03 Ericsson Inc. Method and apparatus for displaying a rotating meter icon on a portable intelligent communications device
US6310610B1 (en) * 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
US6317835B1 (en) * 1998-12-23 2001-11-13 Radiant Systems, Inc. Method and system for entry of encrypted and non-encrypted information on a touch screen
US20020118176A1 (en) * 2000-10-03 2002-08-29 International Business Machines Corporation Portable computer with chord keyboard
US20020080123A1 (en) * 2000-12-26 2002-06-27 International Business Machines Corporation Method for touchscreen data input
US7190348B2 (en) * 2000-12-26 2007-03-13 International Business Machines Corporation Method for touchscreen data input
US20050212780A1 (en) * 2002-10-22 2005-09-29 Timo Tokkonen Method and arrangement for input mode selection
US20060161846A1 (en) * 2002-11-29 2006-07-20 Koninklijke Philips Electronics N.V. User interface with displaced representation of touch area
US20060022960A1 (en) * 2004-07-27 2006-02-02 Yasuyuki Fukushima Input system including position-detecting device
US20060061557A1 (en) * 2004-09-14 2006-03-23 Nokia Corporation Method for using a pointing device

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9141282B2 (en) * 2006-08-18 2015-09-22 Samsung Electronics Co., Ltd Apparatus and method for changing input mode in portable terminal
US20080042990A1 (en) * 2006-08-18 2008-02-21 Samsung Electronics Co., Ltd. Apparatus and method for changing input mode in portable terminal
US20080284746A1 (en) * 2007-05-15 2008-11-20 High Tech Computer, Corp. Electronic device
US8411043B2 (en) * 2007-05-15 2013-04-02 Htc Corporation Electronic device
US20090315847A1 (en) * 2008-06-20 2009-12-24 Konica Minolta Business Technologies, Inc. Input apparatus having touch panel operation accepting method, and operation accepting program embodied on computer readable medium
US20140354595A1 (en) * 2008-12-09 2014-12-04 Microsoft Corporation Touch input interpretation
US20100194693A1 (en) * 2009-01-30 2010-08-05 Sony Ericsson Mobile Communications Ab Electronic apparatus, method and computer program with adaptable user interface environment
US20110001708A1 (en) * 2009-07-06 2011-01-06 Peter Sleeman Sensitivity control as a function of touch shape
CN101943968A (en) * 2009-07-06 2011-01-12 爱特梅尔公司 Sensitivity control as a function of touch shape
US8451237B2 (en) * 2009-07-06 2013-05-28 Atmel Corporation Sensitivity control as a function of touch shape
US9459777B2 (en) 2009-08-27 2016-10-04 Samsung Electronics Co., Ltd. Method and apparatus for setting font size in a mobile terminal having a touch screen
US20130314365A1 (en) * 2012-05-23 2013-11-28 Adrian Woolley Proximity Detection Using Multiple Inputs
US9459737B2 (en) * 2012-05-23 2016-10-04 Atmel Corporation Proximity detection using multiple inputs
US9487388B2 (en) 2012-06-21 2016-11-08 Nextinput, Inc. Ruggedized MEMS force die
US9493342B2 (en) 2012-06-21 2016-11-15 Nextinput, Inc. Wafer level MEMS force dies
US9032818B2 (en) 2012-07-05 2015-05-19 Nextinput, Inc. Microelectromechanical load sensor and methods of manufacturing the same
US20150332107A1 (en) * 2012-12-24 2015-11-19 Nokia Technologies Oy An apparatus and associated methods
US9902611B2 (en) 2014-01-13 2018-02-27 Nextinput, Inc. Miniaturized and ruggedized wafer level MEMs force sensors
US10466119B2 (en) 2015-06-10 2019-11-05 Nextinput, Inc. Ruggedized wafer level MEMS force sensor with a tolerance trench
US20190332197A1 (en) * 2015-10-30 2019-10-31 Microsoft Technology Licensing, Llc Touch sensing of user input device
US10782802B2 (en) * 2015-10-30 2020-09-22 Microsoft Technology Licensing, Llc Touch sensing of user input device
US11808644B2 (en) 2017-02-09 2023-11-07 Qorvo Us, Inc. Integrated piezoresistive and piezoelectric fusion force sensor
US11604104B2 (en) 2017-02-09 2023-03-14 Qorvo Us, Inc. Integrated piezoresistive and piezoelectric fusion force sensor
US11243125B2 (en) 2017-02-09 2022-02-08 Nextinput, Inc. Integrated piezoresistive and piezoelectric fusion force sensor
US11946817B2 (en) 2017-02-09 2024-04-02 DecaWave, Ltd. Integrated digital force sensors and related methods of manufacture
US11255737B2 (en) 2017-02-09 2022-02-22 Nextinput, Inc. Integrated digital force sensors and related methods of manufacture
US11221263B2 (en) 2017-07-19 2022-01-11 Nextinput, Inc. Microelectromechanical force sensor having a strain transfer layer arranged on the sensor die
US11423686B2 (en) 2017-07-25 2022-08-23 Qorvo Us, Inc. Integrated fingerprint and force sensor
US11609131B2 (en) 2017-07-27 2023-03-21 Qorvo Us, Inc. Wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture
US11946816B2 (en) 2017-07-27 2024-04-02 Nextinput, Inc. Wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture
US11243126B2 (en) 2017-07-27 2022-02-08 Nextinput, Inc. Wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture
US11898918B2 (en) 2017-10-17 2024-02-13 Nextinput, Inc. Temperature coefficient of offset compensation for force sensor and strain gauge
US11579028B2 (en) 2017-10-17 2023-02-14 Nextinput, Inc. Temperature coefficient of offset compensation for force sensor and strain gauge
US11385108B2 (en) 2017-11-02 2022-07-12 Nextinput, Inc. Sealed force sensor with etch stop layer
US11965787B2 (en) 2017-11-02 2024-04-23 Nextinput, Inc. Sealed force sensor with etch stop layer
US11874185B2 (en) 2017-11-16 2024-01-16 Nextinput, Inc. Force attenuator for force sensor
US10962427B2 (en) 2019-01-10 2021-03-30 Nextinput, Inc. Slotted MEMS force sensor
US11698310B2 (en) 2019-01-10 2023-07-11 Nextinput, Inc. Slotted MEMS force sensor

Also Published As

Publication number Publication date
JP2009516284A (en) 2009-04-16
KR20080057287A (en) 2008-06-24
WO2007057736A1 (en) 2007-05-24
EP1952223A1 (en) 2008-08-06

Similar Documents

Publication Publication Date Title
US20070115265A1 (en) Mobile device and method
US7479948B2 (en) Terminal and method for entering command in the terminal
EP1466241B1 (en) Method and apparatus for integrating a wide keyboard in a small device
US20160196027A1 (en) Column Organization of Content
CN103309596B (en) The method of adjustment of a kind of entering method keyboard and mobile terminal thereof
US20110167380A1 (en) Mobile device color-based content mapping and navigation
US20150128081A1 (en) Customized Smart Phone Buttons
US20090172531A1 (en) Method of displaying menu items and related touch screen device
US7825900B2 (en) Method and system for selecting a currency symbol for a handheld electronic device
US10901614B2 (en) Method and terminal for determining operation object
KR20070107462A (en) Portable terminal and method of displaying text using same
CN106951175A (en) The control method and mobile terminal of a kind of input through keyboard
EP2615811A1 (en) Improved mobile communication terminal and method
EP1745348A1 (en) Data input method and apparatus
US20050141770A1 (en) Split on-screen keyboard
JPWO2011093230A1 (en) Portable information terminal and key arrangement changing method thereof
CN101290546A (en) Keyboard and Chinese character input method
WO2007031602A1 (en) Method for selecting character interpretation mode
US20130021260A1 (en) Method for inputting korean character on touch screen
KR20110003130A (en) Method for inputting letter in a mobile phone
CN108052212A (en) A kind of method, terminal and computer-readable medium for inputting word
WO2023045920A1 (en) Text display method and text display apparatus
CN107515681B (en) A kind of character input method, mobile terminal and computer readable storage medium
US20070247394A1 (en) Display menu allowing better accessibility in a limited space
CA2538636C (en) Handheld electronic device having improved word correction, and associated method

Legal Events

Date Code Title Description
AS Assignment

Owner name: NOKIA CORPORATION, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RAINISTO, ROOPE;REEL/FRAME:017412/0886

Effective date: 20060228

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION