US20090160803A1 - Information processing device and touch operation detection method - Google Patents


Info

Publication number
US20090160803A1
Authority
US
United States
Prior art keywords
touch
movement amount
touch operation
command
button
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/326,991
Inventor
Hirokazu Hashimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HASHIMOTO, HIROKAZU
Publication of US20090160803A1 publication Critical patent/US20090160803A1/en
Priority to US13/938,932 priority Critical patent/US10168888B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention contains subject matter related to Japanese Patent Application JP2007-330849 filed in the Japanese Patent Office on Dec. 21, 2007, the entire contents of which are incorporated herein by reference.
  • the present invention relates to an information processing device and a touch operation detection method, which are suitably applied to, e.g., a portable navigation device.
  • a portable navigation device (hereinafter, referred to as “PND”) is designed to be detachably attached to the dashboard of a vehicle via a cradle.
  • the PND of such a type serves as a vehicle navigation device when being attached to the dashboard via a cradle and serves as a personal navigation device when being detached from the dashboard.
  • This type of PND aims more for portability than remote-controllability.
  • a remote controller is not provided for the PND but a user interface that receives a command from a user via a touch panel provided on the front surface of a liquid crystal display is adopted.
  • the PND having such a configuration displays a plurality of destination candidates so as to allow a user to select his or her destination. If the destination candidates exist over a plurality of pages, the user needs to perform a page-turning operation in order to determine the destination.
  • where the PND performs a page-turning operation in response to the user's pointing operation using his or her finger in the same manner as in the abovementioned electronic book display control device, the PND often erroneously detects that a depression operation with respect to a destination candidate has been made at the moment when the user unintentionally touches one of the destination candidates with his or her finger instead of a page-turning button.
  • Thus, there is a possibility that the PND cannot correctly reflect the user's intention to perform a pointing operation.
  • the present invention has been made in view of the above points and an object thereof is to propose an information processing device and a touch operation detection method capable of correctly detecting a command issued in response to a user's touch operation.
  • an information processing device including: a display unit having a touch panel on its front surface; a movement amount calculation unit for calculating a movement amount of a touch operation based on a touch point at which the touch operation is performed with respect to the touch panel and a touch release point at which the touch operation is released from the touch panel; an operation determination unit for determining whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and a command recognition unit for recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • the information processing device determines whether a touch operation is a button touch operation or a gesture operation depending on the movement amount of the touch operation, thereby correctly recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • a touch operation detection method including: a touch point detection step in which a touch point detection unit detects a touch point at which a touch operation is performed with respect to a touch panel provided on the front surface of a display unit; a touch release point detection step following the touch point detection step, in which a touch release point detection unit detects a touch release point at which the touch operation is released from the touch panel; a movement amount calculation step in which a movement amount calculation unit calculates a movement amount of the touch operation based on the touch point and touch release point; an operation determination step in which an operation determination unit determines whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and a command recognition step in which a command recognition unit recognizes whether a received command is a command corresponding to the depression operation or the gesture operation.
  • the touch operation detection method determines whether a touch operation is a button touch operation or a gesture operation depending on the movement amount of the touch operation, thereby correctly recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • a touch operation is a button touch operation or a gesture operation is determined depending on the movement amount of the touch operation, allowing correct determination on whether a received command is a command corresponding to the depression operation or the gesture operation. Therefore, an information processing device and a touch operation detection method capable of correctly recognizing a command corresponding to a user's touch operation can be realized.
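  • The determination logic summarized above can be sketched in a few lines. The function names and the straight-line distance metric below are illustrative assumptions; the patent does not prescribe a particular implementation.

```python
# Illustrative sketch of the movement amount calculation unit and the
# operation determination unit; names and the distance metric are
# assumptions, not taken from the patent text.

def movement_amount(touch_point, release_point):
    """Straight-line distance between the touch point and the
    touch release point (same units as the input coordinates)."""
    dx = release_point[0] - touch_point[0]
    dy = release_point[1] - touch_point[1]
    return (dx * dx + dy * dy) ** 0.5

def determine_operation(amount, threshold=5.0):
    """A movement at or below the threshold is treated as a depression
    (button touch) operation; a larger one as a gesture operation."""
    return "depression" if amount <= threshold else "gesture"
```

A command recognition unit would then dispatch on the returned label rather than on the raw touch events.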
  • FIG. 1 is a perspective view schematically showing the outer appearance of a PND
  • FIG. 2 is a block diagram schematically showing a circuit configuration of the PND
  • FIG. 3 is a view schematically showing a guide map image
  • FIG. 4 is a view schematically showing an application menu screen
  • FIG. 5 is a view schematically showing a book selection screen
  • FIG. 6 is a view schematically showing a password requesting screen
  • FIG. 7 is a view schematically showing a spot selection screen
  • FIG. 8 is a view schematically showing an introduction page screen
  • FIG. 9 is a view schematically showing a detailed page screen
  • FIG. 10 is a flowchart showing a procedure of command recognition processing
  • FIG. 11 is a view schematically showing a touch release operation
  • FIG. 12 is a schematic view for explaining a gesture operation.
  • a reference numeral 1 denotes a portable navigation device (hereinafter, referred to as “PND”) according to the embodiment of the present invention.
  • the PND 1 has a display section 2 on which a 4.8-inch liquid crystal display is mounted and uses the display section 2 to present information such as a map, a current position icon PST, and a driving route toward a destination.
  • the PND 1 is held by a cradle 3 attached to a dashboard of a vehicle by a suction cup 3 A and is electrically connected thereto. In this state, the PND 1 operates on power supplied from a battery of the vehicle via the cradle 3 . Further, when removed from the cradle 3 , the PND 1 can operate on power supplied from a battery incorporated therein.
  • a controller 16 having a microcomputer configuration controls the entire operation of the PND 1 according to a basic program and executes various navigation processing, command recognition processing to be described later, and the like according to various application programs stored on a hard disk, etc.
  • the PND 1 uses a GPS module 12 to demodulate satellite signals S 1 from a plurality of GPS satellites received via a GPS antenna 11 and, based on orbital data obtained as a result of the demodulation and distance data between the GPS satellites and the vehicle, accurately measures the current position of the vehicle and transmits the obtained current position data S 2 to a route search section 13 .
  • the orbital data is detailed orbital information (parameter) representing a detailed orbit of each of the GPS satellites.
  • the orbital data needs to be acquired from at least three GPS satellites.
  • the route search section 13 reads out map data S 3 representing the current position of the vehicle and its surrounding area from a map data storage section 14 based on the current position data S 2 , searches the map data S 3 for a driving route from the current position to a destination which has been set by a user, generates a route guide map S 4 including the driving route, and transmits the route guide map S 4 to the display section 2 constituted by a liquid crystal display.
  • the display section 2 displays a guide map image G 1 corresponding to the route guide map data S 4 and thereby allows a user to visually confirm the driving route from a current position icon PST to a destination (not shown).
  • when the controller 16 of the PND 1 receives, via a touch panel 15 , a command S 7 issued in response to depression of an application menu button MB displayed in the lower left of the guide map image G 1 , the controller 16 reads out application menu screen data S 5 from a data storage section 17 .
  • the controller 16 of the PND 1 then displays an application menu screen G 2 corresponding to the application menu screen data S 5 on the display section 2 as shown in FIG. 4 .
  • the application menu screen G 2 displays a music button B 1 for reproduction of music, a video button B 2 for reproduction of video, and a guidebook button B 3 for displaying guidebook information on POIs (Points Of Interest).
  • when detecting a touch operation made to the guidebook button B 3 in the application menu screen G 2 via the touch panel 15 ( FIG. 2 ), the controller 16 of the PND 1 reads out book selection screen data S 6 associated with the guidebook button B 3 from the data storage section 17 .
  • the controller 16 of the PND 1 then displays a book selection screen G 3 corresponding to the book selection screen data S 6 on the display section 2 as shown in FIG. 5 .
  • the book selection screen G 3 displays a list of choices of guidebooks such as “100 spots” item KL 1 , “1,000 spots” item KL 2 , “illumination information (free version)” item KL 3 , and “Ekiben (train lunch) search (free version 2)” item KL 4 , as well as an up button UB 1 and a down button DB 1 for scrolling the list in the up-down direction.
  • the book selection screen G 3 further displays a “return” button BB 1 .
  • when the “return” button BB 1 is touched, the controller 16 of the PND 1 sets back the display content of the display section 2 from the book selection screen G 3 to the application menu screen G 2 .
  • when detecting, via the touch panel 15 , that the “1,000 spots” item KL 2 on the book selection screen G 3 has been touched by a user, the controller 16 of the PND 1 reads out password requesting screen data S 7 from the data storage section 17 if a password has been set for the “1,000 spots” item KL 2 .
  • the controller 16 of the PND 1 displays a password requesting screen G 4 corresponding to the password requesting screen data S 7 on the display section 2 .
  • the password requesting screen G 4 displays a password input field IP, alphabet keys AK, a numeric key NK, a delete key DK, and an enter key EK, as well as a left button LB 1 and a right button RB 2 for moving a cursor CS of the password input field IP in the left-right direction and a “return” button BB 2 .
  • when the “return” button BB 2 is touched, the controller 16 of the PND 1 sets back the display content of the display section 2 from the password requesting screen G 4 to the book selection screen G 3 .
  • the controller 16 of the PND 1 displays, in the password input field IP of the password requesting screen G 4 , a password input in response to a touch operation with respect to the alphabet keys AK or numeric key NK, and determines the password displayed in the password input field IP in response to a touch operation with respect to the enter key EK.
  • when authentication based on the password input to the password input field IP of the password requesting screen G 4 is successful, the controller 16 of the PND 1 reads out spot selection screen data S 8 from the data storage section 17 .
  • the controller 16 of the PND 1 then displays a spot selection screen G 5 corresponding to the spot selection screen data S 8 as shown in FIG. 7 .
  • the spot selection screen G 5 displays, as the POIs which can be candidates for a destination or a stop-off point, a list of spot items SA 1 to SA 4 : “G0001 Jonan Park”, “G0002 Gold Bridge”, “G0003 Port-side Tower”, and “G0004 Yumemigaoka”, and an up button UB 2 and a down button DB 2 for scrolling the list in the up-down direction.
  • the spot selection screen G 5 displays a “return” button BB 3 .
  • when the “return” button BB 3 is touched, the controller 16 of the PND 1 sets back the display content of the display section 2 from the spot selection screen G 5 to the book selection screen G 3 .
  • when the controller 16 of the PND 1 detects, via the touch panel 15 , that, e.g., the spot item SA 1 : “G0001 Jonan Park” displayed on the spot selection screen G 5 has been touched, it reads out introduction page screen data S 8 about “G0001 Jonan Park” from the data storage section 17 .
  • the controller 16 of the PND 1 then displays an introduction page screen G 6 corresponding to the introduction page screen data S 8 on the display section 2 as shown in FIG. 8 .
  • the introduction page screen G 6 displays a guidebook name display field TL 1 , a photo display field GA, a spot item name display field NL 1 , a location specifying information display field AL, a “to-map” button MB 1 for returning to the abovementioned guide map image G 1 ( FIG. 3 ), and a return button BB 4 .
  • the introduction page screen G 6 further displays, in the lower center thereof, a next button NT 1 with an arrow for displaying a next page and a back button BK 1 with an arrow for displaying a previous page.
  • when detecting a touch operation made to the next button NT 1 , the controller 16 of the PND 1 reads out detailed page screen data S 9 concerning the “Jonan Park” from the data storage section 17 .
  • the controller 16 of the PND 1 then displays a detailed page screen G 7 corresponding to the detailed page screen data S 9 on the display section 2 as shown in FIG. 9 .
  • the detailed page screen G 7 displays a guide book name display field TL 2 , a spot item name display field NL 2 , a detailed content display field DL representing detailed content of the “G0001 Jonan Park”, a “to-map” button MB 2 for returning to the guide map image G 1 ( FIG. 3 ), and a return button BB 5 .
  • the controller 16 of the PND 1 allows a user to visually confirm the detailed content display field DL of the detailed page screen G 7 in the manner described above. As a result, the user can grasp the detailed content of the “G0001 Jonan Park” and determine whether or not to set the “G0001 Jonan Park” as his or her destination or stop-off point.
  • the detailed page screen G 7 displays, in the lower center thereof, a next button NT 2 with an arrow for displaying a next page and a back button BK 2 with an arrow for displaying a previous page.
  • when the back button BK 2 is touched, the controller 16 of the PND 1 sets back the content of the display section 2 to the introduction page screen G 6 concerning the “G0001 Jonan Park”.
  • the controller 16 of the PND 1 can perform the switching operation not only by detecting a user's touch operation with respect to the next buttons NT 1 , NT 2 and back buttons BK 1 , BK 2 , but also by recognizing a command corresponding to a finger gesture with respect to the touch panel 15 of the display section 2 .
  • the controller 16 of the PND 1 can perform the page switching processing according to the command. The details of this command recognition processing will be described below.
  • in step SP 1 , when detecting that a touch operation has been made with respect to the touch panel 15 by a user's finger, the controller 16 proceeds to step SP 2 .
  • in step SP 2 , when detecting a touch release operation, i.e., that the user's finger has been separated from the touch panel 15 , the controller 16 of the PND 1 proceeds to step SP 3 .
  • in step SP 3 , the controller 16 of the PND 1 calculates the movement amount of the finger on the display section 2 from the touch point detected in step SP 1 to the touch release point detected in step SP 2 and proceeds to step SP 4 .
  • in step SP 4 , the controller 16 of the PND 1 determines whether the movement amount of the finger calculated in step SP 3 is not more than a predetermined threshold (e.g., 5 mm). When an affirmative result has been obtained, which means that there is little movement, the controller 16 of the PND 1 determines that a gesture operation based on a drag operation of the finger has not been performed and proceeds to step SP 5 .
  • in step SP 5 , the controller 16 of the PND 1 determines whether any button (e.g., the next button NT 1 ) exists at the touch release point as shown in FIG. 11 .
  • when a negative result has been obtained in step SP 5 , which means that the finger motion is neither a gesture operation nor a button touch operation, the controller 16 of the PND 1 determines that no command has been input, proceeds to step SP 8 , and ends this flow without doing anything.
  • when an affirmative result has been obtained in step SP 5 , which means that the finger motion is a button touch operation with respect to, e.g., the next button NT 1 , the controller 16 of the PND 1 proceeds to step SP 6 .
  • in step SP 6 , the controller 16 of the PND 1 recognizes that a command corresponding to the button touch operation with respect to the next button NT 1 has been issued and switches the display content from, e.g., the introduction page screen G 6 ( FIG. 8 ) to the detailed page screen G 7 ( FIG. 9 ). Then, the controller 16 of the PND 1 proceeds to step SP 8 and ends this flow.
  • when a negative result has been obtained in step SP 4 , i.e., the movement amount of the finger calculated in step SP 3 exceeds the predetermined threshold (e.g., 5 mm), which means that a gesture operation based on a drag operation has been performed such that the finger is dragged from left to right on the display section 2 as shown in FIG. 12 , the controller 16 of the PND 1 proceeds to step SP 7 .
  • in step SP 7 , the controller 16 of the PND 1 recognizes, e.g., a page-turning command corresponding to the pattern of the gesture operation and, according to the page-turning command, switches the display content from the introduction page screen G 6 ( FIG. 8 ) to the detailed page screen G 7 ( FIG. 9 ) in such a manner as if the page of a book were being turned over. Then, the controller 16 of the PND 1 proceeds to step SP 8 and ends this flow.
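  • Steps SP 1 through SP 8 can be summarized as a single routine. In the sketch below, `hit_test` and `recognize_gesture` are hypothetical helpers standing in for details the flowchart leaves abstract:

```python
# Rough sketch of the RT1 command recognition flow (steps SP1-SP8).
# hit_test and recognize_gesture are hypothetical helper callables,
# not names from the patent.

THRESHOLD_MM = 5.0  # example threshold given in the text

def recognize_command(touch_point, release_point, hit_test, recognize_gesture):
    dx = release_point[0] - touch_point[0]
    dy = release_point[1] - touch_point[1]
    movement = (dx * dx + dy * dy) ** 0.5      # SP3: movement amount

    if movement <= THRESHOLD_MM:               # SP4: little movement
        button = hit_test(release_point)       # SP5: button at release point?
        if button is None:
            return None                        # SP8: no command input
        return ("button", button)              # SP6: button touch command
    # SP7: a gesture command recognized from the drag pattern
    return ("gesture", recognize_gesture(touch_point, release_point))
```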
  • when recognizing that a gesture operation of a motion pattern in which the finger is dragged from left to right on the display section 2 has been performed, the controller 16 of the PND 1 performs the abovementioned page-turning operation, while when recognizing a motion pattern in which the finger is dragged from right to left, it performs a page turning-back operation to set back the display content from the detailed page screen G 7 ( FIG. 9 ) to the introduction page screen G 6 ( FIG. 8 ).
  • when recognizing that a gesture operation of a motion pattern in which the finger is moved so as to draw a triangle has been performed, the controller 16 of the PND 1 recognizes that the motion means a command to search for a driving route from the current position to the home of the user and displays the driving route obtained as a result of the search on the display section 2 .
  • when recognizing that a gesture operation of a motion pattern in which the finger is swirled clockwise has been performed, the controller 16 of the PND 1 enlarges the map currently displayed on the display section 2 , while when recognizing a motion pattern in which the finger is swirled counterclockwise, it reduces the map size.
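  • The gesture vocabulary described above amounts to a pattern-to-command table. The pattern labels below are illustrative names for the motions in the text, not identifiers from the patent:

```python
# Illustrative mapping of gesture motion patterns to commands, following
# the behaviors described in the text; all names are hypothetical labels.
GESTURE_COMMANDS = {
    "drag_left_to_right":     "turn_page_forward",
    "drag_right_to_left":     "turn_page_back",
    "triangle":               "search_route_to_home",
    "swirl_clockwise":        "enlarge_map",
    "swirl_counterclockwise": "reduce_map",
}

def command_for_gesture(pattern):
    """Return the command for a recognized pattern, or None."""
    return GESTURE_COMMANDS.get(pattern)
```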
  • the controller 16 of the PND 1 thus determines whether a finger motion is a button touch operation or a gesture operation based on a drag operation depending on the movement amount of the finger on the display section 2 from the touch point to the touch release point, recognizes a command corresponding to the button touch operation or gesture operation, and performs predetermined processing according to the command.
  • the controller 16 of the PND 1 does not determine whether the finger motion is a button touch operation or a gesture operation until the finger is separated from the touch panel 15 .
  • even when the finger moves slightly on a button, the controller 16 of the PND 1 does not erroneously determine that the finger motion is a gesture operation but correctly determines that it is a button touch operation with respect to that button.
  • the controller 16 of the PND 1 correctly determines whether a touch operation is a button touch operation instantaneously made with respect to a button or a gesture operation based on a drag operation, thereby correctly and reliably executing predetermined processing specified by a command corresponding to the button touch operation or gesture operation.
  • in the above embodiment, a value of 5 mm is used as the threshold of the movement amount of the finger from its touch point to the touch release point to determine whether the finger motion is a button touch operation or a gesture operation.
  • the present invention is not limited to this.
  • the value of the threshold may be arbitrarily set depending on various factors such as the size of the display area of the display section 2 or the size of the buttons.
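  • Since the threshold is expressed as a physical length, an implementation on a pixel-addressed touch panel would convert it using the panel's dimensions. The panel figures below are assumptions for illustration only:

```python
# Converting a millimetre threshold to pixels for a given panel; the
# 480-pixel / 106 mm width figures are assumptions, not from the patent.
def threshold_in_pixels(threshold_mm, panel_width_px, panel_width_mm):
    return threshold_mm * panel_width_px / panel_width_mm

px_threshold = threshold_in_pixels(5.0, 480, 106.0)  # about 22.6 pixels
```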
  • in the above embodiment, the pattern of a gesture operation includes a drag-like motion of the finger from right to left, a drag-like motion of the finger from left to right, a motion of the finger such as one by which a triangle is drawn, and a clockwise or counterclockwise swirled motion of the finger.
  • the present invention is not limited to this.
  • the pattern of a gesture operation may include a motion of the finger such as one by which an “×” or an “○” is drawn, a finger tapping motion, and other various types of motions.
  • the controller 16 of the PND 1 determines that the finger motion is a gesture operation based on the movement amount of the finger on the display section 2 from the touch point to touch release point irrespective of whether any button displayed on the display section 2 is touched by the finger.
  • alternatively, the controller 16 of the PND 1 may determine that the finger motion is a gesture operation based on the movement amount of the finger only in an area of the display section 2 where no button is displayed.
  • in the above embodiment, the controller 16 of the PND 1 executes the command recognition processing procedure ( FIG. 10 ) of the routine RT 1 according to the command recognition processing program, which is an application program stored on a hard disk.
  • the command recognition processing procedure may be executed according to a command recognition processing program installed from a program storage medium such as a compact disk (CD) or a semiconductor memory, one downloaded from the Internet, or one installed through other various routes.
  • the PND 1 as an information processing device according to the embodiment of the present invention is constituted by the display section 2 serving as a display unit, and controller 16 serving as a movement amount calculation unit, an operation determination unit, and a command recognition unit.
  • the information processing device according to the embodiment of the present invention may be constituted by the display unit, movement amount calculation unit, operation determination unit, and command recognition unit including other various circuit configurations.
  • the information processing device and touch operation detection method can be applied to various electronic apparatuses having a touch panel, such as a dashboard-mounted navigation apparatus other than the PND, a personal computer, a Personal Digital Assistant (PDA), a mobile phone, and a game apparatus.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Navigation (AREA)

Abstract

An information processing device includes: a display unit having a touch panel on its front surface; a movement amount calculation unit that calculates the movement amount of a touch operation based on a touch point at which the touch operation is performed with respect to the touch panel and a touch release point at which the touch operation is released from the touch panel; an operation determination unit that determines whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and a command recognition unit that recognizes whether a received command is a command corresponding to the depression operation or the gesture operation.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • The present invention contains subject matter related to Japanese Patent Application JP2007-330849 filed in the Japanese Patent Office on Dec. 21, 2007, the entire contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an information processing device and a touch operation detection method, which are suitably applied to, e.g., a portable navigation device.
  • 2. Description of the Related Art
  • A portable navigation device (hereinafter, referred to as “PND”) is designed to be detachably attached to the dashboard of a vehicle via a cradle.
  • The PND of such a type serves as a vehicle navigation device when being attached to the dashboard via a cradle and serves as a personal navigation device when being detached from the dashboard.
  • This type of PND aims more for portability than remote-controllability. Thus, a remote controller is not provided for the PND but a user interface that receives a command from a user via a touch panel provided on the front surface of a liquid crystal display is adopted.
  • There is also known an electronic book display control device equipped with a display-integrated tablet on which a user uses his or her finger to perform a pointing operation to realize a page-turning operation (refer to, e.g., Jpn. Pat. Appln. Laid-Open Publication No. 2004-348755).
  • SUMMARY OF THE INVENTION
  • The PND having such a configuration displays a plurality of destination candidates so as to allow a user to select his or her destination. If the destination candidates exist over a plurality of pages, the user needs to perform a page-turning operation in order to determine the destination.
  • In such a case, where the PND performs a page-turning operation in response to the user's pointing operation using his or her finger in the same manner as in the case of the abovementioned electronic book display control device, the PND often erroneously detects that a depression operation with respect to a destination candidate has been made at the moment when the user unintentionally touches one of the destination candidates with his or her finger instead of a page-turning button. Thus, there is a possibility that the PND cannot correctly reflect the user's intention to perform a pointing operation.
  • The present invention has been made in view of the above points and an object thereof is to propose an information processing device and a touch operation detection method capable of correctly detecting a command issued in response to a user's touch operation.
  • To solve the above problem, according to an aspect of the present invention, there is provided an information processing device including: a display unit having a touch panel on its front surface; a movement amount calculation unit for calculating a movement amount of a touch operation based on a touch point at which the touch operation is performed with respect to the touch panel and a touch release point at which the touch operation is released from the touch panel; an operation determination unit for determining whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and a command recognition unit for recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • With the above configuration, the information processing device determines whether a touch operation is a button touch operation or a gesture operation depending on the movement amount of the touch operation, thereby correctly recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • According to another aspect of the present invention, there is provided a touch operation detection method including: a touch point detection step in which a touch point detection unit detects a touch point at which a touch operation is performed with respect to a touch panel provided on the front surface of a display unit; a touch release point detection step following the touch point detection step, in which a touch release point detection unit detects a touch release point at which the touch operation is released from the touch panel; a movement amount calculation step in which a movement amount calculation unit calculates a movement amount of the touch operation based on the touch point and touch release point; an operation determination step in which an operation determination unit determines whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and a command recognition step in which a command recognition unit recognizes whether a received command is a command corresponding to the depression operation or the gesture operation.
  • With the above configuration, the touch operation detection method determines whether a touch operation is a button touch operation or a gesture operation depending on the movement amount of the touch operation, thereby correctly recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
  • According to the present invention, whether a touch operation is a button touch operation or a gesture operation is determined depending on the movement amount of the touch operation, allowing correct determination on whether a received command is a command corresponding to the depression operation or the gesture operation. Therefore, an information processing device and a touch operation detection method capable of correctly recognizing a command corresponding to a user's touch operation can be realized.
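  • The determination summarized above can be sketched in a few lines of code. The following Python sketch is illustrative only: the function names, the coordinate representation, the hit-test callback, and the default 5 mm threshold (taken from the embodiment described later) are assumptions for this sketch, not part of any actual product.

```python
import math

# Illustrative sketch of the operation determination summarized above.
# All names and the 5 mm default threshold are assumptions for this sketch.
THRESHOLD_MM = 5.0

def movement_amount(touch_point, release_point):
    """Movement amount between the touch point and the touch release point."""
    dx = release_point[0] - touch_point[0]
    dy = release_point[1] - touch_point[1]
    return math.hypot(dx, dy)

def classify_touch(touch_point, release_point, button_at=lambda p: None):
    """Classify a touch operation as a gesture, a depression, or no command.

    button_at is a hit-test callback that returns the button displayed at a
    point, or None if no button exists there.
    """
    if movement_amount(touch_point, release_point) > THRESHOLD_MM:
        return ("gesture", None)       # drag-like motion: gesture operation
    button = button_at(release_point)  # little movement: hit-test the release point
    if button is not None:
        return ("depression", button)  # button touch operation
    return ("none", None)              # neither: no command is input
```

Under these assumptions a 20 mm drag classifies as a gesture even if it starts and ends on a button, which is exactly the behavior the above configuration aims at.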
  • The nature, principle and utility of the invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings in which like parts are designated by like reference numerals or characters.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the accompanying drawings:
  • FIG. 1 is a perspective view schematically showing the outer appearance of a PND;
  • FIG. 2 is a block diagram schematically showing a circuit configuration of the PND;
  • FIG. 3 is a view schematically showing a guide map image;
  • FIG. 4 is a view schematically showing an application menu screen;
  • FIG. 5 is a view schematically showing a book selection screen;
  • FIG. 6 is a view schematically showing a password requesting screen;
  • FIG. 7 is a view schematically showing a spot selection screen;
  • FIG. 8 is a view schematically showing an introduction page screen;
  • FIG. 9 is a view schematically showing a detailed page screen;
  • FIG. 10 is a flowchart showing a procedure of command recognition processing;
  • FIG. 11 is a view schematically showing a touch release operation; and
  • FIG. 12 is a schematic view for explaining a gesture operation.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • An embodiment of the present invention will be described in detail below with reference to the accompanying drawings.
  • (1) Outer Appearance of PND
  • In FIG. 1, a reference numeral 1 denotes a portable navigation device (hereinafter, referred to as “PND”) according to the embodiment of the present invention. The PND 1 has a display section 2 on which a 4.8-inch liquid crystal display is mounted and uses the display section 2 to present information such as a map, a current position icon PST, and a driving route toward a destination.
  • The PND 1 is held by a cradle 3 attached to a dashboard of a vehicle by a sucker 3A and is electrically connected thereto. In this state, the PND 1 operates on power supplied from a battery of the vehicle via the cradle 3. Further, when removed from the cradle 3, the PND 1 can operate on power supplied from a battery incorporated therein.
  • (2) Circuit Configuration of PND
  • As shown in FIG. 2, a controller 16 having a microcomputer configuration controls the entire operation of the PND 1 according to a basic program and executes various navigation processing, command recognition processing to be described later, and the like according to various application programs stored on a hard disk, etc.
  • Concretely, the PND 1 uses a GPS module 12 to demodulate satellite signals S1 from a plurality of GPS satellites received via a GPS antenna 11 and, based on orbital data obtained as a result of the demodulation and distance data between the GPS satellites and the vehicle, accurately measures the current position of the vehicle and transmits the obtained current position data S2 to a route search section 13.
  • The orbital data is detailed orbital information (parameter) representing a detailed orbit of each of the GPS satellites. In order to measure the current position of the vehicle with accuracy, the orbital data needs to be acquired from at least three GPS satellites.
  • The route search section 13 reads out map data S3 representing the current position of the vehicle and its surrounding area from a map data storage section 14 based on the current position data S2, searches the map data S3 for a driving route from the current position to a destination set by the user, generates route guide map data S4 including the driving route, and transmits the route guide map data S4 to the display section 2, which is constituted by a liquid crystal display.
  • As shown in FIG. 3, the display section 2 displays a guide map image G1 corresponding to the route guide map data S4 and thereby allows a user to visually confirm the driving route from a current position icon PST to a destination (not shown).
  • When the controller 16 of the PND 1 receives a command S7 which has been issued in response to depression of an application menu button MB displayed in the lower left of the guide map image G1 via a touch panel 15, the controller 16 reads out application menu screen data S5 from a data storage section 17.
  • The controller 16 of the PND 1 then displays an application menu screen G2 corresponding to the application menu screen data S5 on the display section 2 as shown in FIG. 4. In the application menu screen G2, a music button B1 for reproduction of music, a video button B2 for reproduction of video, and a guidebook button B3 for display of Point Of Interest (POI) which can be candidates of a destination or a stop-off point are displayed as choices to be selected through a user's touch operation.
  • When detecting a touch operation made to the guidebook button B3 in the application menu screen G2 via the touch panel 15 (FIG. 2), the controller 16 of the PND 1 reads out a book selection screen data S6 associated with the guidebook button B3 from the data storage section 17.
  • The controller 16 of the PND 1 then displays a book selection screen G3 corresponding to the book selection screen data S6 on the display section 2 as shown in FIG. 5. The book selection screen G3 displays a list of guidebook choices, such as a “100 spots” item KL1, a “1,000 spots” item KL2, an “illumination information - free version” item KL3, and an “Ekiben (train lunch) search - free version 2” item KL4, as well as an up button UB1 and a down button DB1 for scrolling the list in the up-down direction.
  • The book selection screen G3 further displays a “return” button BB1. When the “return” button BB1 is touched by a user, the controller 16 of the PND 1 sets back the display content of the display section 2 from the book selection screen G3 to the application menu screen G2.
  • When detecting, via the touch panel 15, that the “1,000 spots” item KL2 on the book selection screen G3 has been touched by a user, the controller 16 of the PND 1 reads out password requesting screen data S7 from the data storage section 17 if a password has been set for the “1,000 spots” item KL2.
  • Then, as shown in FIG. 6, the controller 16 of the PND 1 displays a password requesting screen G4 corresponding to the password requesting screen data S7 on the display section 2. The password requesting screen G4 displays a password input field IP, alphabet keys AK, a numeric key NK, a delete key DK, an enter key EK, a left button LB1 and a right button RB2 for moving a cursor CS of the password input field IP in the left-right direction, and a “return” button BB2.
  • Also in this case, when the “return” button BB2 is touched by a user, the controller 16 of the PND 1 sets back the display content of the display section 2 from the password requesting screen G4 to the book selection screen G3.
  • The controller 16 of the PND 1 displays, in the password input field IP of the password requesting screen G4, a password input in response to touch operations with respect to the alphabet keys AK or the numeric key NK, and determines the password displayed in the password input field IP in response to a touch operation with respect to the enter key EK.
  • When authentication based on the password input to the password input field IP of the password requesting screen G4 is successful, the controller 16 of the PND 1 reads out spot selection screen data S8 from the data storage section 17.
  • The controller 16 of the PND 1 then displays a spot selection screen G5 corresponding to the spot selection screen data S8 as shown in FIG. 7. The spot selection screen G5 displays, as the POIs which can be candidates of a destination or a stop-off point, a list of spot items SA1 to SA4: “G0001 Jonan Park”, “G0002 Gold Bridge”, “G0003 Port-side Tower”, and “G0004 Yumemigaoka”, as well as an up button UB2 and a down button DB2 for scrolling the list in the up-down direction.
  • Further, the spot selection screen G5 displays a “return” button BB3. When the “return” button BB3 is touched by a user, the controller 16 of the PND 1 sets back the display content of the display section 2 from the spot selection screen G5 to the book selection screen G3.
  • When the controller 16 of the PND 1 detects via the touch panel 15 that e.g., the spot item SA1: “G0001 Jonan Park” displayed on the spot selection screen G5 has been touched, the controller 16 of the PND 1 reads out introduction page screen data S8 about “G0001 Jonan Park” from the data storage section 17.
  • The controller 16 of the PND 1 then displays an introduction page screen G6 corresponding to the introduction page screen data S8 on the display section 2 as shown in FIG. 8. The introduction page screen G6 displays a guidebook name display field TL1, a photo display field GA, a spot item name display field NL1, a location specifying information display field AL, a “to-map” button MB1 for returning to the abovementioned guide map image G1 (FIG. 3), and a return button BB4.
  • The introduction page screen G6 further displays, in the lower center thereof, a next button NT1 with an arrow for displaying a next page and a back button BK1 with an arrow for displaying a previous page. When the next button NT1 is touched by a user, the controller 16 of the PND 1 reads out detailed page screen data S9 concerning the “Jonan Park” from the data storage section 17.
  • The controller 16 of the PND 1 then displays a detailed page screen G7 corresponding to the detailed page screen data S9 on the display section 2 as shown in FIG. 9. The detailed page screen G7 displays a guide book name display field TL2, a spot item name display field NL2, a detailed content display field DL representing detailed content of the “G0001 Jonan Park”, a “to-map” button MB2 for returning to the guide map image G1 (FIG. 3), and a return button BB5.
  • The controller 16 of the PND 1 allows a user to visually confirm the detailed content display field DL of the detailed page screen G7 in the manner as described above. As a result, the user can grasp the detailed content of the “G0001 Jonan Park” and determine whether to set or not the “G0001 Jonan Park” as his or her destination or stop-off point.
  • Also in this case, the detailed page screen G7 displays, in the lower center thereof, a next button NT2 with an arrow for displaying a next page and a back button BK2 with an arrow for displaying a previous page. When the back button BK2 is touched by a user, the controller 16 of the PND 1 sets back the content of the display section 2 to the introduction page screen G6 concerning the “G0001 Jonan Park”.
  • In addition, when switching from the introduction page screen G6 to the detailed page screen G7 (i.e., turning the page) or from the detailed page screen G7 to the introduction page screen G6 (i.e., turning the page back), the controller 16 of the PND 1 can perform the switching operation not only by detecting a user's touch operation with respect to the next buttons NT1, NT2 and the back buttons BK1, BK2, but also by recognizing a command corresponding to a finger gesture with respect to the touch panel 15 of the display section 2. In this case, the controller 16 of the PND 1 performs the page switching processing according to the recognized command. The details of this command recognition processing will be described below.
  • (3) Procedure of Command Recognition Processing
  • As shown in FIG. 10, the controller 16 of the PND 1 enters the processing from the starting step of a routine RT1 according to a command recognition processing program, which is an application program read from a hard disk, and proceeds to step SP1. In step SP1, when detecting that a touch operation has been made with respect to the touch panel 15 by a user's finger, the controller 16 proceeds to step SP2.
  • In step SP2, when detecting a touch release operation, i.e., detecting that the user's finger has been separated from the touch panel 15, the controller 16 of the PND 1 proceeds to step SP3.
  • In step SP3, the controller 16 of the PND 1 calculates the movement amount of the finger on the display section 2 from the touch point detected in step SP1 to the touch release point detected in step SP2 and proceeds to step SP4.
  • In step SP4, the controller 16 of the PND 1 determines whether the movement amount of the finger calculated in step SP3 is not more than a predetermined threshold (e.g., 5 mm). When an affirmative result has been obtained, which means that there is little movement, the controller 16 of the PND 1 determines that a gesture operation based on a drag operation of the finger has not been performed and proceeds to step SP5.
  • Since the finger motion does not mean a gesture operation, the controller 16 of the PND 1 determines, in step SP5, whether any button (e.g., next button NT1) exists at the touch release point as shown in FIG. 11.
  • When a negative result has been obtained in step SP5, which means that the finger motion means neither a gesture operation nor a button touch operation, the controller 16 of the PND 1 determines that no command has been input and proceeds to step SP8 where the controller 16 of the PND 1 ends this flow without doing anything.
  • On the other hand, when an affirmative result has been obtained in step SP5, which means that the finger motion means a button touch operation with respect to, e.g., the next button NT1, the controller 16 of the PND 1 proceeds to step SP6.
  • In step SP6, the controller 16 of the PND 1 recognizes that a command corresponding to the button touch operation with respect to the next button NT1 has been issued and switches the display content from, e.g., the introduction page screen G6 (FIG. 8) to detailed page screen G7 (FIG. 9). Then, the controller 16 of the PND 1 proceeds to step SP8 and ends this flow.
  • When a negative result has been obtained in step SP4, i.e., the movement amount of the finger calculated in step SP3 exceeds a predetermined threshold (e.g., 5 mm), which means that a gesture operation based on a drag operation has been performed such that the finger is dragged from the left to right on the display section 2 as shown in FIG. 12, the controller 16 of the PND 1 proceeds to step SP7.
  • In step SP7, the controller 16 of the PND 1 recognizes, e.g., a page-turning command corresponding to the pattern of a gesture operation and, according to the page-turning command, switches the display content from the introduction page screen G6 (FIG. 8) to detailed page screen G7 (FIG. 9) in such a manner as if the page of a book were turned over. Then, the controller 16 of the PND 1 proceeds to step SP8 and ends this flow.
  • When recognizing that a gesture operation of a motion pattern in which the finger is dragged from the left to right on the display section 2 has been performed, the controller 16 of the PND 1 performs the abovementioned page-turning operation, while when recognizing that a gesture operation of a motion pattern in which the finger is dragged from the right to left on the display section 2 has been performed, the controller 16 of the PND 1 performs a page turning back operation to set back the display content from the detailed page screen G7 (FIG. 9) to introduction page screen G6 (FIG. 8).
  • When recognizing that a gesture operation of a motion pattern in which the finger is moved so as to draw a triangle has been performed, the controller 16 of the PND 1 recognizes that the motion means a command to search a driving route from the current position to the home of a user and displays the driving route obtained as a result of the search on the display section 2.
  • When recognizing that a gesture operation of a motion pattern in which the finger is swirled clockwise has been performed, the controller 16 of the PND 1 enlarges a map currently displayed on the display section 2, while when recognizing that a gesture operation of a motion pattern in which the finger is swirled counterclockwise has been performed, the controller 16 of the PND 1 reduces the map size.
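  • Taken together, the gesture patterns above form a small pattern-to-command table. The Python sketch below uses hypothetical pattern and command identifiers; only the associations between patterns and commands are taken from the description above.

```python
# Hypothetical pattern-to-command table; identifiers are illustrative,
# only the associations come from the embodiment described above.
GESTURE_COMMANDS = {
    "drag_left_to_right":     "turn_page_forward",     # G6 -> G7
    "drag_right_to_left":     "turn_page_back",        # G7 -> G6
    "triangle":               "search_route_to_home",
    "swirl_clockwise":        "enlarge_map",
    "swirl_counterclockwise": "reduce_map",
}

def command_for_gesture(pattern):
    """Return the command for a recognized gesture pattern, or None.

    Returning None mirrors step SP8: an unrecognized motion issues no command.
    """
    return GESTURE_COMMANDS.get(pattern)
```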
  • (4) Operation and Effect
  • With the above configuration, the controller 16 of the PND 1 determines whether a finger motion is a button touch operation or a gesture operation based on a drag operation depending on the movement amount of the finger on the display section 2 from the touch point to touch release point, recognizes a command corresponding to the button touch operation or gesture operation, and performs predetermined processing according to the command.
  • That is, even if any button displayed on the display section 2 is touched by the finger, the controller 16 of the PND 1 does not determine whether the finger motion is a button touch operation or a gesture operation until the finger is separated from the button.
  • Therefore, even if a point at which any button exists is accidentally touched by the finger although the user intended to perform a gesture operation, it is possible to prevent the controller 16 of the PND 1 from erroneously determining that the finger motion is a button touch operation.
  • Further, even if the finger touches a button and moves by an amount not more than the predetermined threshold before it is separated from the display section 2, the controller 16 of the PND 1 does not erroneously determine that the finger motion is a gesture operation but correctly determines that it is a button touch operation with respect to that button.
  • According to the above configuration, the controller 16 of the PND 1 correctly determines whether a touch operation is a button touch operation instantaneously made with respect to a button or a gesture operation based on a drag operation to thereby correctly and reliably execute predetermined processing specified by a command corresponding to the button touch operation or gesture operation.
  • (5) Other Embodiments
  • In the above embodiment, a value of 5 mm is used as the threshold of the movement amount of the finger from its touch point to its touch release point to determine whether the finger motion is a button touch operation or a gesture operation. However, the present invention is not limited to this. For example, the value of the threshold may be arbitrarily set depending on various factors such as the size of the display area of the display section 2 or the size of the buttons.
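  • Since a touch panel reports positions in pixels, a physical threshold such as 5 mm must be converted using the display's pixel density if it is to remain physically consistent across display sizes; this is one concrete way the threshold could be adapted as suggested above. The 132 dpi value in the sketch below is an arbitrary illustrative figure, not a property of the display section 2.

```python
MM_PER_INCH = 25.4

def threshold_px(threshold_mm=5.0, dpi=132.0):
    """Convert a physical movement-amount threshold (mm) to pixels.

    The 132 dpi default is an arbitrary illustrative value; a real device
    would use the actual pixel density of its display.
    """
    return threshold_mm / MM_PER_INCH * dpi
```

At 132 dpi, for example, a 5 mm threshold corresponds to roughly 26 pixels.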
  • Further, in the above embodiment, the pattern of a gesture operation includes the drag-like motion of the finger from the right to left, drag-like motion of the finger from the left to right, motion of the finger such as one by which a triangle is drawn, and clockwise or counterclockwise swirled motion of the finger. However, the present invention is not limited to this. For example, the pattern of a gesture operation may include a motion of the finger such as one by which “x” or “□” is drawn, a finger tapping motion, and other various types of motions.
  • Further, in the above embodiment, the controller 16 of the PND 1 determines that the finger motion is a gesture operation based on the movement amount of the finger on the display section 2 from the touch point to the touch release point, irrespective of whether any button displayed on the display section 2 is touched by the finger. However, the present invention is not limited to this; the controller 16 of the PND 1 may instead determine that the finger motion is a gesture operation based on the movement amount of the finger only in regions of the display section 2 where no button is displayed.
  • Further, in the above embodiment, the controller 16 of the PND 1 executes the command recognition processing procedure (FIG. 10) of the routine RT1 according to the command recognition processing program which is an application program stored in a hard disk. However, the present invention is not limited to this but the command recognition processing procedure may be executed according to a command recognition processing program installed from a program storage medium such as a compact disk (CD) or a semiconductor memory, one downloaded from the Internet, or one installed through other various routes.
  • Further, in the above embodiment, the PND 1 as an information processing device according to the embodiment of the present invention is constituted by the display section 2 serving as a display unit, and controller 16 serving as a movement amount calculation unit, an operation determination unit, and a command recognition unit. Alternatively, however, the information processing device according to the embodiment of the present invention may be constituted by the display unit, movement amount calculation unit, operation determination unit, and command recognition unit including other various circuit configurations.
  • The information processing device and touch operation detection method can be applied to various electronic apparatuses having a touch panel, such as a dashboard-mounted navigation apparatus other than the PND, a personal computer, a Personal Digital Assistant (PDA), a mobile phone, and a game apparatus.
  • It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. An information processing device comprising:
display means having a touch panel on its front surface;
movement amount calculation means for calculating a movement amount of a touch operation based on a touch point at which the touch operation is performed with respect to the touch panel and a touch release point at which the touch operation is released from the touch panel;
operation determination means for determining whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and
command recognition means for recognizing whether a received command is a command corresponding to the depression operation or the gesture operation.
2. The information processing device according to claim 1, wherein
the operation determination means determines that the touch operation is a gesture operation when the movement amount exceeds a predetermined threshold.
3. The information processing device according to claim 2, wherein
the command recognition means recognizes the command associated with a pattern of the gesture operation.
4. The information processing device according to claim 1, wherein
the operation determination means determines that, when the movement amount is not more than a predetermined threshold and a button is displayed at the touch release point at which the touch operation is released, the touch operation is the depression operation with respect to the button.
5. A touch operation detection method comprising:
a touch point detection step in which touch point detection means detects a touch point at which a touch operation is performed with respect to a touch panel provided on the front surface of display means;
a touch release point detection step following the touch point detection step, in which touch release point detection means detects a touch release point at which the touch operation is released from the touch panel;
a movement amount calculation step in which movement amount calculation means calculates a movement amount of the touch operation based on the touch point and the touch release point;
an operation determination step in which operation determination means determines whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and
a command recognition step in which command recognition means recognizes whether a received command is a command corresponding to the depression operation or the gesture operation.
6. The touch operation detection method according to claim 5, wherein
the operation determination step determines that the touch operation is the gesture operation when the movement amount exceeds a predetermined threshold.
7. The touch operation detection method according to claim 6, wherein
the command recognition step recognizes the command associated with a pattern of the gesture operation.
8. The touch operation detection method according to claim 5, wherein
the operation determination step determines that, when the movement amount is not more than a predetermined threshold and a button is displayed at the touch release point at which the touch operation is released, the touch operation is the depression operation with respect to the button.
9. An information processing device comprising:
a display unit having a touch panel on its front surface;
a movement amount calculation unit that calculates a movement amount of a touch operation based on a touch point at which the touch operation is performed with respect to the touch panel and a touch release point at which the touch operation is released from the touch panel;
an operation determination unit that determines whether the touch operation is a depression operation or a gesture operation depending on the calculated movement amount; and
a command recognition unit that recognizes whether a received command is a command corresponding to the depression operation or the gesture operation.
US12/326,991 2007-12-21 2008-12-03 Information processing device and touch operation detection method Abandoned US20090160803A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/938,932 US10168888B2 (en) 2007-12-21 2013-07-10 Information processing device and touch operation detection method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007-330849 2007-12-21
JP2007330849A JP5239328B2 (en) 2007-12-21 2007-12-21 Information processing apparatus and touch motion recognition method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/938,932 Continuation US10168888B2 (en) 2007-12-21 2013-07-10 Information processing device and touch operation detection method

Publications (1)

Publication Number Publication Date
US20090160803A1 true US20090160803A1 (en) 2009-06-25


US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US9753545B2 (en) 2012-08-17 2017-09-05 Nec Solutions Innovators, Ltd. Input device, input method, and storage medium
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US9798393B2 (en) 2011-08-29 2017-10-24 Apple Inc. Text correction processing
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9953088B2 (en) 2012-05-14 2018-04-24 Apple Inc. Crowd sourcing information to fulfill user requests
US9966068B2 (en) 2013-06-08 2018-05-08 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9971774B2 (en) 2012-09-19 2018-05-15 Apple Inc. Voice-based media searching
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US10079014B2 (en) 2012-06-08 2018-09-18 Apple Inc. Name recognition system
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10083690B2 (en) 2014-05-30 2018-09-25 Apple Inc. Better resolution when referencing to concepts
US10089072B2 (en) 2016-06-11 2018-10-02 Apple Inc. Intelligent device arbitration and control
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10169329B2 (en) 2014-05-30 2019-01-01 Apple Inc. Exemplar-based natural language processing
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US10185542B2 (en) 2013-06-09 2019-01-22 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
WO2019062243A1 (en) * 2017-09-26 2019-04-04 出门问问信息科技有限公司 Identification method and apparatus for touch operation, and electronic device
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US10261630B2 (en) 2012-04-27 2019-04-16 Panasonic Intellectual Property Corporation Of America Input device, input support method, and program
US10269345B2 (en) 2016-06-11 2019-04-23 Apple Inc. Intelligent task discovery
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10283110B2 (en) 2009-07-02 2019-05-07 Apple Inc. Methods and apparatuses for automatic speech recognition
US10297253B2 (en) 2016-06-11 2019-05-21 Apple Inc. Application integration with a digital assistant
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10318871B2 (en) 2005-09-08 2019-06-11 Apple Inc. Method and apparatus for building an intelligent automated assistant
US10332518B2 (en) 2017-05-09 2019-06-25 Apple Inc. User interface for correcting recognition errors
US10346118B2 (en) 2016-10-06 2019-07-09 Toyota Jidosha Kabushiki Kaisha On-vehicle operation device
US10354011B2 (en) 2016-06-09 2019-07-16 Apple Inc. Intelligent automated assistant in a home environment
US10356243B2 (en) 2015-06-05 2019-07-16 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10403283B1 (en) 2018-06-01 2019-09-03 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US10410637B2 (en) 2017-05-12 2019-09-10 Apple Inc. User-specific acoustic models
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10482874B2 (en) 2017-05-15 2019-11-19 Apple Inc. Hierarchical belief states for digital assistants
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10521466B2 (en) 2016-06-11 2019-12-31 Apple Inc. Data driven natural language event detection and classification
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US10568032B2 (en) 2007-04-03 2020-02-18 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US10643611B2 (en) 2008-10-02 2020-05-05 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10684703B2 (en) 2018-06-01 2020-06-16 Apple Inc. Attention aware virtual assistant dismissal
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10699717B2 (en) 2014-05-30 2020-06-30 Apple Inc. Intelligent assistant for home automation
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10714117B2 (en) 2013-02-07 2020-07-14 Apple Inc. Voice trigger for a digital assistant
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US10733993B2 (en) 2016-06-10 2020-08-04 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US10748546B2 (en) 2017-05-16 2020-08-18 Apple Inc. Digital assistant services based on device capabilities
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10755703B2 (en) 2017-05-11 2020-08-25 Apple Inc. Offline personal assistant
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
US10789945B2 (en) 2017-05-12 2020-09-29 Apple Inc. Low-latency intelligent automated assistant
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US10791176B2 (en) 2017-05-12 2020-09-29 Apple Inc. Synchronization and task delegation of a digital assistant
US10810274B2 (en) 2017-05-15 2020-10-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11010127B2 (en) 2015-06-29 2021-05-18 Apple Inc. Virtual assistant for media playback
US11023513B2 (en) 2007-12-20 2021-06-01 Apple Inc. Method and apparatus for searching using an active ontology
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US11069336B2 (en) 2012-03-02 2021-07-20 Apple Inc. Systems and methods for name pronunciation
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US11217251B2 (en) 2019-05-06 2022-01-04 Apple Inc. Spoken notifications
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US11231904B2 (en) 2015-03-06 2022-01-25 Apple Inc. Reducing response latency of intelligent automated assistants
US11237797B2 (en) 2019-05-31 2022-02-01 Apple Inc. User activity shortcut suggestions
US11269678B2 (en) 2012-05-15 2022-03-08 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
US11314370B2 (en) 2013-12-06 2022-04-26 Apple Inc. Method for extracting salient dialog usage from live data
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US11468282B2 (en) 2015-05-15 2022-10-11 Apple Inc. Virtual assistant in a communication session
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
US11495218B2 (en) 2018-06-01 2022-11-08 Apple Inc. Virtual assistant operation in multi-device environments
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8413065B2 (en) 2009-09-07 2013-04-02 Qualcomm Incorporated User interface methods for ending an application
JP5537458B2 (en) * 2011-02-10 2014-07-02 Sharp Corp Image display device capable of touch input, control device for display device, and computer program
CN103988163A (en) 2011-12-07 2014-08-13 International Business Machines Corp Method of displaying electronic document, and apparatus and computer program thereof
JP5761526B2 (en) 2012-03-01 2015-08-12 Konica Minolta Inc Operation display device
JP5991509B2 (en) * 2012-03-02 2016-09-14 Konica Minolta Inc Information processing apparatus and program
JP5945926B2 (en) * 2012-03-26 2016-07-05 Konica Minolta Inc Operation display device
JP5461735B2 (en) * 2012-04-27 2014-04-02 Panasonic Corp Input device, input support method, and program
JPWO2013191028A1 (en) 2012-06-22 2016-05-26 Sony Corp Detection device, detection method, and program
JP6399834B2 (en) * 2014-07-10 2018-10-03 Canon Inc Information processing apparatus, information processing apparatus control method, and program
CN104536680B (en) * 2014-12-03 2018-06-19 Huizhou TCL Mobile Communication Co Ltd Mobile terminal operation triggering method and system based on touch screen operation time
JP2016042383A (en) * 2015-11-25 2016-03-31 Casio Computer Co Ltd User operation processing apparatus, user operation processing method, and program
JP6323960B2 (en) * 2016-08-02 2018-05-16 Honda Motor Co Ltd Input device
CN106775743B (en) * 2016-12-27 2020-05-19 Yulong Computer Telecommunication Scientific (Shenzhen) Co Ltd Bottom tray and virtual key display method, device and terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6094197A (en) * 1993-12-21 2000-07-25 Xerox Corporation Graphical keyboard
US20030206199A1 (en) * 2002-05-03 2003-11-06 Nokia Corporation Method and apparatus for interaction with a user interface
US20050248542A1 (en) * 2004-05-07 2005-11-10 Pentax Corporation Input device and method for controlling input device
US20080184173A1 (en) * 2007-01-31 2008-07-31 Microsoft Corporation Controlling multiple map application operations with a single gesture
US20080201650A1 (en) * 2007-01-07 2008-08-21 Lemay Stephen O Web-Clip Widgets on a Portable Multifunction Device
US20090102806A1 (en) * 2007-10-19 2009-04-23 Steve Tomkins System having user interface using object selection and gestures

Family Cites Families (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0644185B2 (en) * 1985-04-26 1994-06-08 Nippondenso Co Ltd Vehicle guide device
JPH03269717A (en) 1990-03-20 1991-12-02 Hitachi Ltd Method and device for selection of data
US5608635A (en) * 1992-04-14 1997-03-04 Zexel Corporation Navigation system for a vehicle with route recalculation between multiple locations
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
US5848373A (en) * 1994-06-24 1998-12-08 Delorme Publishing Company Computer aided map location system
JP3181181B2 (en) * 1994-11-11 2001-07-03 Sharp Corp Document information processing device
US5938720A (en) * 1995-02-09 1999-08-17 Visteon Technologies, Llc Route generation in a vehicle navigation system
JPH0962446A (en) * 1995-08-22 1997-03-07 Matsushita Electric Works Ltd Touch panel input method and device therefor
KR0183524B1 (en) * 1995-09-27 1999-04-15 Haruo Mori Navigation system for displaying a structure-shape map
JPH10269021A (en) * 1997-03-25 1998-10-09 Sharp Corp Touch panel input device
US6057845A (en) * 1997-11-14 2000-05-02 Sensiva, Inc. System, method, and apparatus for generation and recognizing universal commands
JPH11259206A (en) 1998-03-09 1999-09-24 Fujitsu Ltd Infrared detection system input device
JP3385965B2 (en) * 1998-04-20 2003-03-10 Seiko Epson Corp Input device and input method
US6539080B1 (en) * 1998-07-14 2003-03-25 Ameritech Corporation Method and system for providing quick directions
JP2000137564A (en) * 1998-11-02 2000-05-16 Pioneer Electronic Corp Picture operating device and its method
JP4155671B2 (en) * 1999-07-12 2008-09-24 Alpine Electronics Inc Car navigation system
EP1266282B1 (en) * 2000-03-17 2010-04-21 Microsoft Corporation System and method for non-uniform scaled mapping
JP3513084B2 (en) * 2000-06-14 2004-03-31 Toshiba Corp Information processing system, information equipment and information processing method
US6897853B2 (en) * 2000-11-10 2005-05-24 Microsoft Corp. Highlevel active pen matrix
US6405129B1 (en) * 2000-11-29 2002-06-11 Alpine Electronics, Inc. Method of displaying POI icons for navigation apparatus
US6542817B2 (en) * 2001-03-13 2003-04-01 Alpine Electronics, Inc. Route search method in navigation system
US6571169B2 (en) * 2001-03-16 2003-05-27 Alpine Electronics, Inc. Destination input method in navigation system and navigation system
US6640185B2 (en) * 2001-07-21 2003-10-28 Alpine Electronics, Inc. Display method and apparatus for navigation system
JP2003072488A (en) * 2001-08-31 2003-03-12 Sony Corp Onboard device and processing method of vehicle and vehicle information
JP3842617B2 (en) * 2001-10-31 2006-11-08 Kenwood Corp Touch panel input device, audio device and input method
JP2003172624A (en) * 2001-12-10 2003-06-20 Mitsubishi Electric Corp Apparatus and method for searching intersection
JP2003263144A (en) 2002-03-11 2003-09-19 Toshiba Corp Information processing apparatus and display control method
US7103854B2 (en) * 2002-06-27 2006-09-05 Tele Atlas North America, Inc. System and method for associating text and graphical views of map information
JP4300767B2 (en) * 2002-08-05 2009-07-22 Sony Corp Guide system, content server, portable device, information processing method, information processing program, and storage medium
US7272497B2 (en) * 2003-03-24 2007-09-18 Fuji Jukogyo Kabushiki Kaisha Vehicle navigation system with multi-use display
US6856901B2 (en) * 2003-06-02 2005-02-15 Alpine Electronics, Inc. Display method and apparatus for navigation system
JP4192731B2 (en) * 2003-09-09 2008-12-10 Sony Corp Guidance information providing apparatus and program
JP4855654B2 (en) * 2004-05-31 2012-01-18 Sony Corp On-vehicle device, on-vehicle device information providing method, program for the information providing method, and recording medium recording the program
JP2004348755A (en) 2004-06-23 2004-12-09 Sharp Corp Device for controlling electronic book display
US7761814B2 (en) * 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
JP2006215915A (en) * 2005-02-04 2006-08-17 Ricoh Co Ltd Image processor and method for controlling image processor
TW200632770A (en) * 2005-03-07 2006-09-16 Giga Byte Comm Inc POI data structure and method for operating and applying the same
JP4135110B2 (en) * 2005-03-15 2008-08-20 Sony Corp Point search device and search method
EP2280483A1 (en) * 2005-06-03 2011-02-02 Synaptics, Incorporated Methods and systems for shielding a charge transfer capacitance sensor for proximity detection
US7449895B2 (en) 2005-06-03 2008-11-11 Synaptics Incorporated Methods and systems for detecting a capacitance using switched charge transfer techniques
JP5395429B2 (en) * 2005-06-03 2014-01-22 Synaptics Inc Method and system for detecting capacitance using sigma-delta measurement
JP2007018095A (en) 2005-07-05 2007-01-25 Matsushita Electric Ind Co Ltd Electronic display device
EP1840520B1 (en) * 2006-03-31 2008-09-24 Research In Motion Limited User interface methods and apparatus for controlling the visual display of maps having selectable map elements in mobile communications devices
US10313505B2 (en) * 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
KR100875374B1 (en) * 2007-03-07 2008-12-23 Thinkware Co Ltd Tree structure name retrieval method and its system
US7990394B2 (en) * 2007-05-25 2011-08-02 Google Inc. Viewing and navigating within panoramic images, and applications thereof
US8874364B2 (en) * 2007-08-02 2014-10-28 Volkswagen Ag Navigation system

Cited By (250)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9646614B2 (en) 2000-03-16 2017-05-09 Apple Inc. Fast, language-independent method for user authentication by voice
US11928604B2 (en) 2005-09-08 2024-03-12 Apple Inc. Method and apparatus for building an intelligent automated assistant
US10318871B2 (en) 2005-09-08 2019-06-11 Apple Inc. Method and apparatus for building an intelligent automated assistant
US10568032B2 (en) 2007-04-03 2020-02-18 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US11023513B2 (en) 2007-12-20 2021-06-01 Apple Inc. Method and apparatus for searching using an active ontology
US10381016B2 (en) 2008-01-03 2019-08-13 Apple Inc. Methods and apparatus for altering audio output signals
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US9865248B2 (en) 2008-04-05 2018-01-09 Apple Inc. Intelligent text-to-speech conversion
US9626955B2 (en) 2008-04-05 2017-04-18 Apple Inc. Intelligent text-to-speech conversion
US9535906B2 (en) 2008-07-31 2017-01-03 Apple Inc. Mobile device having human language translation capability with positional feedback
US10108612B2 (en) 2008-07-31 2018-10-23 Apple Inc. Mobile device having human language translation capability with positional feedback
US11348582B2 (en) 2008-10-02 2022-05-31 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US10643611B2 (en) 2008-10-02 2020-05-05 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US9858925B2 (en) 2009-06-05 2018-01-02 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US11080012B2 (en) 2009-06-05 2021-08-03 Apple Inc. Interface for a virtual digital assistant
US10795541B2 (en) 2009-06-05 2020-10-06 Apple Inc. Intelligent organization of tasks items
US10475446B2 (en) 2009-06-05 2019-11-12 Apple Inc. Using context information to facilitate processing of commands in a virtual assistant
US10283110B2 (en) 2009-07-02 2019-05-07 Apple Inc. Methods and apparatuses for automatic speech recognition
US20110167350A1 (en) * 2010-01-06 2011-07-07 Apple Inc. Assist Features For Content Display Device
US10496753B2 (en) 2010-01-18 2019-12-03 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US9548050B2 (en) 2010-01-18 2017-01-17 Apple Inc. Intelligent automated assistant
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US10741185B2 (en) 2010-01-18 2020-08-11 Apple Inc. Intelligent automated assistant
US11423886B2 (en) 2010-01-18 2022-08-23 Apple Inc. Task flow identification based on user intent
US10706841B2 (en) 2010-01-18 2020-07-07 Apple Inc. Task flow identification based on user intent
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US10679605B2 (en) 2010-01-18 2020-06-09 Apple Inc. Hands-free list-reading by intelligent automated assistant
US10553209B2 (en) 2010-01-18 2020-02-04 Apple Inc. Systems and methods for hands-free notification summaries
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US10692504B2 (en) 2010-02-25 2020-06-23 Apple Inc. User profiling for voice input processing
US9633660B2 (en) 2010-02-25 2017-04-25 Apple Inc. User profiling for voice input processing
US10049675B2 (en) 2010-02-25 2018-08-14 Apple Inc. User profiling for voice input processing
US9283829B2 (en) * 2010-03-11 2016-03-15 Volkswagen Ag Process and device for displaying different information for driver and passenger of a vehicle
US20130047112A1 (en) * 2010-03-11 2013-02-21 X Method and device for operating a user interface
EP3270276A1 (en) * 2010-03-15 2018-01-17 Samsung Electronics Co., Ltd. Portable device and control method thereof
US20110221686A1 (en) * 2010-03-15 2011-09-15 Samsung Electronics Co., Ltd. Portable device and control method thereof
EP2367098A3 (en) * 2010-03-15 2014-07-30 Samsung Electronics Co., Ltd. Portable device and control method thereof
US9691281B2 (en) * 2010-03-24 2017-06-27 Telenav, Inc. Navigation system with image assisted navigation mechanism and method of operation thereof
US20110238290A1 (en) * 2010-03-24 2011-09-29 Telenav, Inc. Navigation system with image assisted navigation mechanism and method of operation thereof
US10191648B2 (en) * 2011-02-10 2019-01-29 Sharp Kabushiki Kaisha Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
US20150346945A1 (en) * 2011-02-10 2015-12-03 Sharp Kabushiki Kaisha Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus
US10102359B2 (en) 2011-03-21 2018-10-16 Apple Inc. Device access using voice authentication
US10417405B2 (en) 2011-03-21 2019-09-17 Apple Inc. Device access using voice authentication
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US20120299854A1 (en) * 2011-05-25 2012-11-29 Kyocera Corporation Mobile electronic device and input method
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US11350253B2 (en) 2011-06-03 2022-05-31 Apple Inc. Active transport based notifications
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US11120372B2 (en) 2011-06-03 2021-09-14 Apple Inc. Performing actions associated with task items that represent tasks to perform
US9798393B2 (en) 2011-08-29 2017-10-24 Apple Inc. Text correction processing
US20140240101A1 (en) * 2011-09-15 2014-08-28 Nec Casio Mobile Communications, Ltd. Device and method for processing write information of electronic tag
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
DE102011116187A1 (en) 2011-10-14 2013-04-18 Volkswagen Aktiengesellschaft Method for providing user interface for interaction with e.g. infotainment system, in vehicle, involves expanding region in which graphic object e.g. electronic log book, is displayed on touch screen by detected gesture for page turning
CN107037973A (en) * 2012-01-10 2017-08-11 佳能株式会社 Display control unit and its control method
US9086792B2 (en) * 2012-01-10 2015-07-21 Canon Kabushiki Kaisha Display control apparatus and control method thereof
US20130176256A1 (en) * 2012-01-10 2013-07-11 Canon Kabushiki Kaisha Display control apparatus and control method thereof
CN103197881A (en) * 2012-01-10 2013-07-10 佳能株式会社 Display control apparatus and control method thereof
US11069336B2 (en) 2012-03-02 2021-07-20 Apple Inc. Systems and methods for name pronunciation
US9483461B2 (en) 2012-03-06 2016-11-01 Apple Inc. Handling speech synthesis of content for multiple languages
EP2838774A4 (en) * 2012-04-16 2015-05-20 Nuance Communications Inc Low-attention gestural user interface
WO2013158533A1 (en) 2012-04-16 2013-10-24 Nuance Communications, Inc. Low-attention gestural user interface
US10261630B2 (en) 2012-04-27 2019-04-16 Panasonic Intellectual Property Corporation Of America Input device, input support method, and program
US9953088B2 (en) 2012-05-14 2018-04-24 Apple Inc. Crowd sourcing information to fulfill user requests
US11269678B2 (en) 2012-05-15 2022-03-08 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US10079014B2 (en) 2012-06-08 2018-09-18 Apple Inc. Name recognition system
US9495129B2 (en) 2012-06-29 2016-11-15 Apple Inc. Device, method, and user interface for voice-activated navigation and browsing of a document
US9753545B2 (en) 2012-08-17 2017-09-05 Nec Solutions Innovators, Ltd. Input device, input method, and storage medium
US9971774B2 (en) 2012-09-19 2018-05-15 Apple Inc. Voice-based media searching
USD702140S1 (en) 2013-01-02 2014-04-08 Garmin Switzerland Gmbh Navigation device
US10978090B2 (en) 2013-02-07 2021-04-13 Apple Inc. Voice trigger for a digital assistant
US10714117B2 (en) 2013-02-07 2020-07-14 Apple Inc. Voice trigger for a digital assistant
CN104034339A (en) * 2013-03-04 2014-09-10 观致汽车有限公司 Method and apparatus for browsing electronic map on vehicle navigator
US9633674B2 (en) 2013-06-07 2017-04-25 Apple Inc. System and method for detecting errors in interactions with a voice-based digital assistant
US9620104B2 (en) 2013-06-07 2017-04-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US9582608B2 (en) 2013-06-07 2017-02-28 Apple Inc. Unified ranking with entropy-weighted information for phrase-based semantic auto-completion
US9966060B2 (en) 2013-06-07 2018-05-08 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
US10657961B2 (en) 2013-06-08 2020-05-19 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US9966068B2 (en) 2013-06-08 2018-05-08 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
US10769385B2 (en) 2013-06-09 2020-09-08 Apple Inc. System and method for inferring user intent from speech inputs
US10185542B2 (en) 2013-06-09 2019-01-22 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US11048473B2 (en) 2013-06-09 2021-06-29 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US11314370B2 (en) 2013-12-06 2022-04-26 Apple Inc. Method for extracting salient dialog usage from live data
USD742258S1 (en) 2014-01-15 2015-11-03 Garmin Switzerland Gmbh Navigation device
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US10497365B2 (en) 2014-05-30 2019-12-03 Apple Inc. Multi-command single utterance input method
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9842101B2 (en) 2014-05-30 2017-12-12 Apple Inc. Predictive conversion of language input
US10083690B2 (en) 2014-05-30 2018-09-25 Apple Inc. Better resolution when referencing to concepts
US10714095B2 (en) 2014-05-30 2020-07-14 Apple Inc. Intelligent assistant for home automation
US10657966B2 (en) 2014-05-30 2020-05-19 Apple Inc. Better resolution when referencing to concepts
US10699717B2 (en) 2014-05-30 2020-06-30 Apple Inc. Intelligent assistant for home automation
US10078631B2 (en) 2014-05-30 2018-09-18 Apple Inc. Entropy-guided text prediction using combined word and character n-gram language models
US11133008B2 (en) 2014-05-30 2021-09-28 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US10417344B2 (en) 2014-05-30 2019-09-17 Apple Inc. Exemplar-based natural language processing
US11257504B2 (en) 2014-05-30 2022-02-22 Apple Inc. Intelligent assistant for home automation
US9760559B2 (en) 2014-05-30 2017-09-12 Apple Inc. Predictive text input
US9785630B2 (en) 2014-05-30 2017-10-10 Apple Inc. Text prediction using combined word N-gram and unigram language models
US10169329B2 (en) 2014-05-30 2019-01-01 Apple Inc. Exemplar-based natural language processing
US10878809B2 (en) 2014-05-30 2020-12-29 Apple Inc. Multi-command single utterance input method
US10659851B2 (en) 2014-06-30 2020-05-19 Apple Inc. Real-time digital assistant knowledge updates
US10904611B2 (en) 2014-06-30 2021-01-26 Apple Inc. Intelligent automated assistant for TV user interactions
US9668024B2 (en) 2014-06-30 2017-05-30 Apple Inc. Intelligent automated assistant for TV user interactions
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US10446141B2 (en) 2014-08-28 2019-10-15 Apple Inc. Automatic speech recognition based on user feedback
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10431204B2 (en) 2014-09-11 2019-10-01 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10789041B2 (en) 2014-09-12 2020-09-29 Apple Inc. Dynamic thresholds for always listening speech trigger
US10453443B2 (en) 2014-09-30 2019-10-22 Apple Inc. Providing an indication of the suitability of speech recognition
US9646609B2 (en) 2014-09-30 2017-05-09 Apple Inc. Caching apparatus for serving phonetic pronunciations
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9986419B2 (en) 2014-09-30 2018-05-29 Apple Inc. Social reminders
US9886432B2 (en) 2014-09-30 2018-02-06 Apple Inc. Parsimonious handling of word inflection via categorical stem + suffix N-gram language models
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10390213B2 (en) 2014-09-30 2019-08-20 Apple Inc. Social reminders
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10438595B2 (en) 2014-09-30 2019-10-08 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US20160155211A1 (en) * 2014-11-28 2016-06-02 Kyocera Document Solutions Inc. Display device and recording medium used for on-screen operation such as swipe operation
US10552013B2 (en) 2014-12-02 2020-02-04 Apple Inc. Data detection
US11556230B2 (en) 2014-12-02 2023-01-17 Apple Inc. Data detection
US9648181B2 (en) 2014-12-17 2017-05-09 Kyocera Document Solutions Inc. Touch panel device and image processing apparatus
USD751933S1 (en) 2014-12-22 2016-03-22 Garmin Switzerland Gmbh Navigation device
US9865280B2 (en) 2015-03-06 2018-01-09 Apple Inc. Structured dictation using intelligent automated assistants
US11231904B2 (en) 2015-03-06 2022-01-25 Apple Inc. Reducing response latency of intelligent automated assistants
US10529332B2 (en) 2015-03-08 2020-01-07 Apple Inc. Virtual assistant activation
US11087759B2 (en) 2015-03-08 2021-08-10 Apple Inc. Virtual assistant activation
US10311871B2 (en) 2015-03-08 2019-06-04 Apple Inc. Competing devices responding to voice triggers
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US10930282B2 (en) 2015-03-08 2021-02-23 Apple Inc. Competing devices responding to voice triggers
US9899019B2 (en) 2015-03-18 2018-02-20 Apple Inc. Systems and methods for structured stem and suffix language models
US9842105B2 (en) 2015-04-16 2017-12-12 Apple Inc. Parsimonious continuous-space phrase representations for natural language processing
US11468282B2 (en) 2015-05-15 2022-10-11 Apple Inc. Virtual assistant in a communication session
US11127397B2 (en) 2015-05-27 2021-09-21 Apple Inc. Device voice control
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US10127220B2 (en) 2015-06-04 2018-11-13 Apple Inc. Language identification from short strings
US10356243B2 (en) 2015-06-05 2019-07-16 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US10101822B2 (en) 2015-06-05 2018-10-16 Apple Inc. Language input correction
US10681212B2 (en) 2015-06-05 2020-06-09 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US10186254B2 (en) 2015-06-07 2019-01-22 Apple Inc. Context-based endpoint detection
US10255907B2 (en) 2015-06-07 2019-04-09 Apple Inc. Automatic accent detection using acoustic models
US11010127B2 (en) 2015-06-29 2021-05-18 Apple Inc. Virtual assistant for media playback
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US11500672B2 (en) 2015-09-08 2022-11-15 Apple Inc. Distributed personal assistant
US9697820B2 (en) 2015-09-24 2017-07-04 Apple Inc. Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en) 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
US11526368B2 (en) 2015-11-06 2022-12-13 Apple Inc. Intelligent automated assistant in a messaging environment
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10354652B2 (en) 2015-12-02 2019-07-16 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10942703B2 (en) 2015-12-23 2021-03-09 Apple Inc. Proactive assistance based on dialog communication between devices
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US11069347B2 (en) 2016-06-08 2021-07-20 Apple Inc. Intelligent automated assistant for media exploration
US10354011B2 (en) 2016-06-09 2019-07-16 Apple Inc. Intelligent automated assistant in a home environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10733993B2 (en) 2016-06-10 2020-08-04 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US11037565B2 (en) 2016-06-10 2021-06-15 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
US10521466B2 (en) 2016-06-11 2019-12-31 Apple Inc. Data driven natural language event detection and classification
US10942702B2 (en) 2016-06-11 2021-03-09 Apple Inc. Intelligent device arbitration and control
US10269345B2 (en) 2016-06-11 2019-04-23 Apple Inc. Intelligent task discovery
US10089072B2 (en) 2016-06-11 2018-10-02 Apple Inc. Intelligent device arbitration and control
US10297253B2 (en) 2016-06-11 2019-05-21 Apple Inc. Application integration with a digital assistant
US10580409B2 (en) 2016-06-11 2020-03-03 Apple Inc. Application integration with a digital assistant
US11152002B2 (en) 2016-06-11 2021-10-19 Apple Inc. Application integration with a digital assistant
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10553215B2 (en) 2016-09-23 2020-02-04 Apple Inc. Intelligent automated assistant
US10346118B2 (en) 2016-10-06 2019-07-09 Toyota Jidosha Kabushiki Kaisha On-vehicle operation device
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US11656884B2 (en) 2017-01-09 2023-05-23 Apple Inc. Application integration with a digital assistant
US10332518B2 (en) 2017-05-09 2019-06-25 Apple Inc. User interface for correcting recognition errors
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
US10741181B2 (en) 2017-05-09 2020-08-11 Apple Inc. User interface for correcting recognition errors
US10847142B2 (en) 2017-05-11 2020-11-24 Apple Inc. Maintaining privacy of personal information
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US10755703B2 (en) 2017-05-11 2020-08-25 Apple Inc. Offline personal assistant
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
US10789945B2 (en) 2017-05-12 2020-09-29 Apple Inc. Low-latency intelligent automated assistant
US10791176B2 (en) 2017-05-12 2020-09-29 Apple Inc. Synchronization and task delegation of a digital assistant
US10410637B2 (en) 2017-05-12 2019-09-10 Apple Inc. User-specific acoustic models
US11405466B2 (en) 2017-05-12 2022-08-02 Apple Inc. Synchronization and task delegation of a digital assistant
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
US10482874B2 (en) 2017-05-15 2019-11-19 Apple Inc. Hierarchical belief states for digital assistants
US10810274B2 (en) 2017-05-15 2020-10-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
US11217255B2 (en) 2017-05-16 2022-01-04 Apple Inc. Far-field extension for digital assistant services
US10909171B2 (en) 2017-05-16 2021-02-02 Apple Inc. Intelligent automated assistant for media exploration
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
US10748546B2 (en) 2017-05-16 2020-08-18 Apple Inc. Digital assistant services based on device capabilities
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
WO2019062243A1 (en) * 2017-09-26 2019-04-04 出门问问信息科技有限公司 Identification method and apparatus for touch operation, and electronic device
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
US10720160B2 (en) 2018-06-01 2020-07-21 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
US11009970B2 (en) 2018-06-01 2021-05-18 Apple Inc. Attention aware virtual assistant dismissal
US10684703B2 (en) 2018-06-01 2020-06-16 Apple Inc. Attention aware virtual assistant dismissal
US10984798B2 (en) 2018-06-01 2021-04-20 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US11495218B2 (en) 2018-06-01 2022-11-08 Apple Inc. Virtual assistant operation in multi-device environments
US10403283B1 (en) 2018-06-01 2019-09-03 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
US10504518B1 (en) 2018-06-03 2019-12-10 Apple Inc. Accelerated task performance
US10944859B2 (en) 2018-06-03 2021-03-09 Apple Inc. Accelerated task performance
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
US11217251B2 (en) 2019-05-06 2022-01-04 Apple Inc. Spoken notifications
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
US11360739B2 (en) 2019-05-31 2022-06-14 Apple Inc. User activity shortcut suggestions
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
US11237797B2 (en) 2019-05-31 2022-02-01 Apple Inc. User activity shortcut suggestions
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators

Also Published As

Publication number Publication date
US20130300698A1 (en) 2013-11-14
JP2009151691A (en) 2009-07-09
US10168888B2 (en) 2019-01-01
JP5239328B2 (en) 2013-07-17

Similar Documents

Publication Publication Date Title
US10168888B2 (en) Information processing device and touch operation detection method
US8477096B2 (en) Display apparatus and method of controlling same
JP5379259B2 (en) Screen display device
US8731820B2 (en) Method and apparatus for keyboard arrangement for efficient data entry for navigation system
US9292093B2 (en) Interface method and apparatus for inputting information with air finger gesture
US9477400B2 (en) Method and apparatus for navigation system for selecting icons and application area by hand drawing on map image
US7577518B2 (en) Navigation system
JP4574329B2 (en) Image display device
US7792635B2 (en) Multi-function navigation system
CN105283356A (en) Program, method, and device for controlling application, and recording medium
JPH1096648A (en) Information indicator with touch panel
KR20100129627A (en) Mobile vehicle navigation method and apparatus thereof
JP2008180786A (en) Navigation system and navigation device
JP2007003328A (en) Car navigation system
JP2010128685A (en) Electronic equipment
EP2838002A1 (en) Display device
JP2003166843A (en) Navigation device
JP2004126842A (en) Image processor
JP2002350151A (en) Navigation device
KR101542495B1 (en) Method for displaying information for mobile terminal and apparatus thereof
KR100861353B1 (en) Navigation system with multifunction and opertation method thereof
JP5461030B2 (en) Input device
EP1901038B1 (en) Map display system and navigation system
JP4161768B2 (en) Electronic device and program with dictionary function
JP2004325322A (en) Navigation system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HASHIMOTO, HIROKAZU;REEL/FRAME:021922/0030

Effective date: 20081118

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION