US20070262970A1 - Input operation system - Google Patents
- Publication number: US20070262970A1
- Authority: US (United States)
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
Definitions
- The present invention generally relates to an input operation system for use in a vehicle.
- Japanese patent document JP-A-H07-319623 discloses a touch panel that has a transparent sheet disposed thereon with a protrusion for indicating a button position.
- Japanese patent document JP-A-H10-269012 discloses an apparatus including a touch panel for sensing a touching finger of a user, combined with a display unit that displays a switch operation screen with a superposed outline image of the touching finger. The outline image is derived by imaging the finger as it actually touches the separately positioned touch panel, so that the user can operate the switch operation screen without looking at the touch panel.
- Japanese patent document JP-A-H07-319623 improves user operability by indicating the button position in the switch operation screen with the transparent sheet on the touch panel.
- However, when the menus in the switch operation screen are hierarchically structured and variably arranged, for example for controlling a navigation system, providing plural transparent sheets to accommodate the respective menus is difficult.
- Japanese patent document JP-A-H10-269012 not only requires the protrusion on the surface of the touch panel, but also requires a coordinated operation of the touch panel and the switch operation screen, because the display unit is positioned separately from the touch panel.
- In addition, a flat touch panel surface without any positioning structure for the touching finger draws the user's gaze and attention away from the traveling direction of the vehicle, thereby deteriorating the operability of the touch panel.
- In view of the above problems, the present invention provides an input operation system with improved operability for a user who controls vehicular devices in a vehicle through a touch panel.
- In one aspect of the present invention, the input operation system for use in a vehicle includes a touch panel that senses a touching finger of a user, an imaging unit that images the touch panel, an outline extraction unit that extracts an outline image of the touching finger based on an image of the touch panel, an image generation unit that generates a composite image by superposing the extracted outline image on a switch operation screen, and a display control unit that displays the composite image on a display panel.
- The touch panel has a touch sensitive surface with a protrusion disposed thereon, and the image generation unit generates the composite image by superposing the outline image of the touching finger on a switch operation screen that is backed by an image of the protrusion.
- In this manner, the protrusion on the surface of the touch panel is displayed as a background of the switch operation screen, enabling the user to intuitively position the touching finger relative to a menu item of the switch operation screen without looking at the touch panel, for improved operability.
- In another aspect of the present invention, the touch panel is automatically lit by a backlight after a certain period of an out-of-contact condition with a vehicular device such as a steering wheel, enabling the user to easily locate the touch panel; an operation detection unit detects the condition and a lamp control unit controls the backlight.
- FIG. 1 shows a block diagram of an input operation system in an embodiment of the present disclosure;
- FIGS. 2A and 2B show illustrations of a touch panel of the input operation system;
- FIGS. 3A and 3B show illustrations of the touch panel under control of a user's hand and an outline image superposed on a switch operation screen;
- FIGS. 4A and 4B show illustrations of the touch panel with markers formed thereon;
- FIG. 5 shows a block diagram of the input operation system in another embodiment of the present disclosure; and
- FIG. 6 shows a flowchart of a control process of the input operation system in that embodiment.
- The input operation system 1 is for use in a vehicle.
- The system 1 includes a touch switch 2, a camera 3, a control unit 4, a display 5, and a navigation unit 6.
- The touch switch 2 includes a touch panel 7 for sensing a touching finger of a user, and a backlight 8 for lighting the touch panel 7.
- The touch switch 2 is, for example, disposed on a center console or the like so that a user in the driving position can reach it with the left hand without releasing the right hand from the steering wheel.
- When sensing the touching finger, the switch 2 outputs to the control unit 4 position coordinates that represent the position of the touching finger on the touch panel 7.
- A sensing surface 7a of the touch panel 7 has a logotype of DENSO formed thereon as shown in FIG. 2A.
- The logotype 'D' 'E' 'N' 'S' 'O' is formed in a protruding manner on the surface 7a, and the respective letters 'D' 9, 'E' 10, 'N' 11, 'S' 12, and 'O' 13 (protrusions 9-13) have a predetermined protrusion height that does not obstruct smooth movement of the touching finger across the surface 7a.
- The camera 3 is a CCD camera, a CMOS camera, or the like in the present embodiment, and is disposed on the roof of the vehicle, behind the instrument panel or the backlight 8, or the like. In this manner, the camera 3 images the entire area of the touch panel 7 and outputs image data to the control unit 4.
- The camera 3 may also be an infrared camera or the like.
- The display 5 is, for example, a liquid crystal display positioned in the normal forward view of a driver, and displays a composite image that is inputted from the control unit 4.
- The size of the display 5 is preferably the same as the size of the touch panel 7.
- The navigation unit 6 outputs a switch operation screen such as a menu screen, a destination setting screen, a search screen, or the like to the control unit 4, and performs an operation according to a switch control signal from the control unit 4 when the signal is provided.
- The control unit 4 includes a CPU, a RAM, a ROM, and other components, which together implement an outline extraction processing module, a touch detection processing module, a composite image generation processing module, and a display control processing module. In this manner, the control unit 4 controls the entire operation of the input operation system 1. More practically, the control unit 4 executes the outline extraction processing module upon receiving the image data from the camera 3, and extracts an outline of the touching finger based on the image of the user's hand. The image of the touching finger is processed by using, for example, a chroma-key process. In this case, the surface color of the touch panel 7 is set to blue, a complementary color of skin, and the backlight 8 lights the touch panel 7. In this manner, the influence of ambient light is decreased.
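The chroma-key extraction described above can be sketched in a few lines of numpy. This is an illustrative reconstruction, not code from the patent; the blue reference color and the distance threshold are hypothetical values, since the patent only states that the panel surface is blue as a complementary color of skin.

```python
import numpy as np

# Hypothetical values: the patent gives no numeric color or threshold.
PANEL_BLUE = np.array([30, 60, 200], dtype=np.int32)  # RGB of the panel surface
THRESHOLD = 80  # maximum color distance still treated as panel background

def finger_mask(frame: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels differing from the blue panel color,
    i.e. the hand/finger region in the camera image of the touch panel."""
    dist = np.linalg.norm(frame.astype(np.int32) - PANEL_BLUE, axis=-1)
    return dist > THRESHOLD

def outline(mask: np.ndarray) -> np.ndarray:
    """Outline of the mask: foreground pixels with at least one
    4-connected background neighbour."""
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior
```

Lighting the blue panel with the backlight, as the patent notes, keeps the background color stable so a fixed threshold like this remains usable under varying ambient light.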
- The control unit 4 executes the touch detection processing module upon receiving the position coordinates of the touching finger from the touch switch 2, for detecting the touching position of the user's hand. Further, the control unit 4 executes the composite image generation processing module for generating a composite image of the switch operation screen by superposing the outline image of the touching finger on the switch operation screen with the DENSO logo background. That is, the composite image displays the positions of the protrusions 9-13 on the surface 7a of the touch panel 7 as a background image of the switch operation screen. Then, the control unit 4 executes the display control processing module to display the composite image on the display 5 as shown in FIG. 2B.
- The positions (i.e., the coordinates) of the protrusions 9-13 of the DENSO logo on the touch panel 7 are substantially the same as the coordinates of the display position of the DENSO logo on the display 5.
- The user thus receives tactile feedback from the touching finger on the touch panel 7 without actually looking at the touch panel 7, as shown in FIGS. 3A and 3B.
- More practically, the DENSO logo in the background image gives the user a clue for positioning the touching finger at the right control position for the respective menu items on the switch operation screen, relative to the positions of the protrusions 9-13 formed on the surface 7a of the touch panel 7.
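The composite image generation can be sketched as a simple overlay, relying on the stated property that the panel and display coordinates substantially coincide. The white outline color is a hypothetical choice; the patent does not specify how the outline is rendered.

```python
import numpy as np

OUTLINE_COLOR = np.array([255, 255, 255], dtype=np.uint8)  # hypothetical

def composite(screen: np.ndarray, outline_mask: np.ndarray) -> np.ndarray:
    """Superpose the finger-outline mask on the switch operation screen.
    `screen` already carries the protrusion (logo) image as background,
    so the drawn outline can be compared against the logo positions."""
    assert screen.shape[:2] == outline_mask.shape  # shared coordinates
    out = screen.copy()
    out[outline_mask] = OUTLINE_COLOR
    return out
```

Because the protrusion coordinates on the panel match the logo coordinates on the display, no geometric transform is needed in this sketch; the mask is painted directly onto the screen image.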
- The protrusions 9-13 formed on the surface 7a of the touch panel 7 may be replaced with marking protrusions 14-16 shaped as a star, a disk, and a square, as shown in FIG. 4A.
- In this case, the background image of the switch operation screen in the composite image is changed to one showing the corresponding shapes of the marking protrusions 14-16, as shown in FIG. 4B.
- Further, the positions of the protrusions 9-16 in the image derived from the camera 3 may be utilized for accurately positioning the outline image of the user's hand in the composite image, by comparing the positions of the protrusions 9-16 in the image data from the camera 3 with their positions in the composite image outputted to the display 5.
- The colors of the protrusions 9-16 may be made different from the surface color of the touch panel 7 so that their positions can be reliably extracted.
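The marker-based alignment can be sketched as computing the translation between a marker's centroid in the camera image and its known position in the composite image, then shifting the hand outline by that amount. This is a minimal sketch assuming a single marker and a pure-translation misalignment; the patent does not specify the comparison method.

```python
import numpy as np

def centroid(mask: np.ndarray) -> np.ndarray:
    """(row, col) centroid of the True pixels of a marker mask."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def registration_offset(camera_marker_mask, display_marker_pos):
    """Translation (dy, dx) that maps the marker position seen by the
    camera onto its known position in the composite image."""
    return np.asarray(display_marker_pos, dtype=float) - centroid(camera_marker_mask)

def align_outline(outline_mask, offset):
    """Shift the hand-outline mask by the integer-rounded offset so that
    it lines up with the switch operation screen."""
    dy, dx = np.rint(offset).astype(int)
    return np.roll(np.roll(outline_mask, dy, axis=0), dx, axis=1)
```

With several markers (e.g. the star, disk, and square), the per-marker offsets could be averaged, or used to estimate rotation and scale as well; a single translation is the simplest case.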
- The advantages of the input operation system 1 in the present embodiment are summarized as follows. The input operation system 1 for use in the vehicle has the protrusions 9-16 on the surface 7a of the touch panel 7, in association with the protrusion-marked background image of the switch operation screen in the composite image, thereby providing the user with improved operability based on intuitive tactile feedback from the touching finger for locating the respective menu items on the touch panel 7 without a glance.
- Further, the protrusions 9-16 formed on the surface 7a of the touch panel 7 provide an aesthetic improvement, being formed in an industrially and ergonomically reasonable shape.
- A second embodiment is described with reference to FIGS. 5 and 6, focusing mainly on its differences from the first embodiment. An input operation system 21 in the present embodiment includes a touch switch 22, a camera 23, a control unit 24, a display 25, and a navigation unit 26.
- The touch switch 22 includes a touch panel 27 and a backlight 28, and outputs to the control unit 24 an operation detection signal that represents either an in-contact or an out-of-contact condition of the touching finger of the user with the touch panel 27.
- The touch panel 27 also outputs contact position coordinates of the touching finger to the control unit 24 while the touching finger is touching the touch panel 27.
- The touch switch 22 lights itself with the backlight 28 according to a lamp control signal inputted from the control unit 24.
- A steering wheel 29 for steering the vehicle has an operation detection sensor 30 for detecting an in-contact condition and an out-of-contact condition of the touching finger with the steering wheel 29.
- The operation detection sensor 30 outputs to the control unit 24 an operation detection signal that represents the contact condition of the touching finger of the user with the steering wheel 29.
- The control unit 24 executes a steering operation detection module, a touch panel operation detection module, and a lamp control module, besides the modules described in the first embodiment.
- The control unit 24 determines whether the touching finger of the user is in the in-contact or the out-of-contact condition with the steering wheel 29 by executing the steering operation detection module, which analyzes the operation detection signal from the operation detection sensor 30, and likewise determines the contact condition of the touching finger with the touch switch 22 by executing the touch panel operation detection module, which analyzes the operation detection signal from the touch switch 22. Further, the control unit 24 controls the lighting operation of the backlight 28 by executing the lamp control module.
- A control process of the input operation system 21 in the present embodiment is described with reference to the flowchart in FIG. 6.
- The control unit 24 determines whether the touching finger of the user has lost contact with the steering wheel 29 in step S1 and, when the out-of-contact condition is detected based on the operation detection signal from the detection sensor 30 (step S1: YES), starts a clock to measure a first lapse time in step S2. Then, in step S3, the control unit 24 determines whether a preset first lapse time has passed, and in step S4, whether the touching finger of the user is touching the steering wheel 29 again.
- When the control unit 24 detects that the preset first lapse time has passed before detecting contact of the touching finger with the steering wheel 29 (step S3: YES), that is, when the out-of-contact condition with the steering wheel 29 has exceeded the preset first lapse time, the control unit 24 outputs the lamp control signal to the touch switch 22 for turning on the backlight 28 to a normal level of brightness in step S5. Then, the control unit 24 starts clocks to measure second and third lapse times in steps S6 and S7, and determines whether the touching finger has contacted the touch panel 27 in step S8 as well as whether the preset second lapse time has passed in step S9. The preset third lapse time is configured to be longer than the preset second lapse time.
- When the control unit 24 detects that the preset second lapse time has passed before detecting contact of the touching finger with the touch panel 27 (step S9: YES), that is, when the out-of-contact condition with the touch panel 27 after turning on the backlight 28 at the normal level of brightness has exceeded the preset second lapse time, the control unit 24 determines whether the brightness of the backlight 28 has reached the maximum level in step S10, and increases the brightness of the backlight 28 by one level in step S11 upon detecting that the maximum level has not been reached (step S10: NO). Then, the control unit 24 resumes measurement of the second lapse time in step S12 and returns to step S8 to continue the control process.
- When the control unit 24 detects that the touching finger of the user has contacted the touch panel 27 before the elapse of the preset second lapse time (step S8: YES), the control unit 24 determines the brightness level of the backlight 28 at the time of the detected contact in step S13. When the brightness level of the backlight 28 is not at the normal level (step S13: NO), the control unit 24 puts the brightness level of the backlight 28 back to the normal level in step S14.
- The control unit 24 then determines whether the touching finger of the user has lost contact with the touch panel 27 in step S15 and whether the touching finger has contacted the steering wheel 29 in step S16, and concludes the input system control process by turning off the backlight 28 in step S17 after determining that the user has lost contact with the touch panel 27 (step S15: YES) and has brought the touching finger into contact with the steering wheel 29 (step S16: YES).
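The flow of steps S1-S17 can be sketched as a small timer-driven state machine. This is an interpretive sketch, not code from the patent: the lapse-time durations and brightness levels are hypothetical values, and the separate third-lapse-time turn-off branch is omitted for brevity.

```python
OFF, NORMAL = 0, 1
MAX_LEVEL = 3      # hypothetical number of brightness levels
FIRST_LAPSE = 2.0  # hypothetical durations in seconds; the patent only
SECOND_LAPSE = 5.0 # requires the third lapse time to exceed the second

class BacklightController:
    """Sketch of FIG. 6: light the panel at normal brightness once the hand
    has been off the wheel for the first lapse time, brighten one level per
    second lapse time while the panel stays untouched, drop back to normal
    on touch, and turn off when the hand returns to the wheel."""

    def __init__(self):
        self.level = OFF
        self.wheel_release_t = None  # when the hand left the wheel
        self.untouched_t = None      # start of the current second-lapse clock

    def wheel_released(self, t):
        # steps S1/S2: hand left the steering wheel; start the first clock
        self.wheel_release_t = t

    def tick(self, t, touching_panel, touching_wheel):
        if self.level == OFF:
            if touching_wheel:
                self.wheel_release_t = None  # step S4: hand back on wheel
            elif (self.wheel_release_t is not None
                    and t - self.wheel_release_t >= FIRST_LAPSE):
                self.level = NORMAL          # steps S3/S5: turn on
                self.untouched_t = t         # steps S6/S7: start next clocks
            return self.level
        if touching_panel:
            self.level = NORMAL              # steps S13/S14: reset brightness
            self.untouched_t = t
        elif t - self.untouched_t >= SECOND_LAPSE:
            if self.level < MAX_LEVEL:       # steps S9-S11: brighten a level
                self.level += 1
            self.untouched_t = t             # step S12: restart second clock
        if not touching_panel and touching_wheel:
            self.level = OFF                 # steps S15-S17: turn off
            self.wheel_release_t = None
        return self.level
```

In a vehicle this would be driven by the operation detection signals from the touch switch 22 and the sensor 30 rather than by explicit timestamps; passing the time in makes the logic easy to test.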
- The input operation system 21 lights the touch panel 27 with the backlight 28 at the normal brightness level once the touching finger has been off the steering wheel 29 for the preset first lapse time, thereby improving operability: the user can locate the touch panel 27 without looking away from the front of the vehicle, or with only a minimal glance.
- The brightness of the backlight 28 is configured to return to the normal level after the preset second lapse time of the out-of-contact condition of the touching finger with the touch panel 27, thereby suppressing unnecessary battery consumption.
- Unnecessary battery consumption is further suppressed by the preset third lapse time, which turns off the backlight upon detecting a continued out-of-contact condition of the touching finger with the touch panel, or upon detecting the combined conditions of loss of contact with the touch panel 27 and contact with the steering wheel 29.
- The control units 4 and 24, the displays 5 and 25, and the navigation units 6 and 26 may be integrally formed as a navigation system of the vehicle.
- The logos and protrusions in the first embodiment may take other forms and other qualities.
- The operation detection sensor may be provided on a shift knob in the second embodiment, and the backlight control may be based on the in- and out-of-contact conditions of the touching finger with the shift knob in combination with the contact conditions with the steering wheel.
Description
- This application is based on and claims the benefit of priority of Japanese Patent Application No. 2006-133879 filed on May 12, 2006, the disclosure of which is incorporated herein by reference.
- The present invention generally relates to an input operation system for use in a vehicle.
- In recent years, various techniques are disclosed for easily locating a button in a switch operation screen displayed on a touch sensitive panel of a display unit. For example, Japanese patent document JP-A-H07-319623 discloses a touch panel that has a transparent sheet disposed thereon with a protrusion for indicating a button position. Further, Japanese patent document JP-A-H10-269012 discloses an apparatus including a touch panel for sensing a touching finger of a user in combination with a display unit for displaying a switch operation screen with a superposed outline of a touching finger image. The outline image of the touching finger is derived from imaging the touching finger actually touching the separately positioned touch panel for providing a no-lookaway operability of the switch operation screen on the touch panel.
- The disclosure in Japanese patent document JP-A-H07-319623 improves user operability for indicating the button position in the switch operation screen by having the transparent sheet on the touch panel. However, when menus in the switch operation screen is hierarchically structured and variably arranged for controlling, for example, a navigation system, having plural transparent sheets for accommodating respective menus is difficult.
- Further, the disclosure in Japanese patent document JP-A-H10-269012 not only requires the protrusion on the surface of the touch panel, but also requires an associated operation of the touch panel and the switch operation screen due to its separate positioning of the display unit away from the touch panel. In addition, a flat surface of the touch panel without any positioning structure for the touching finger requires user's look and attention distracted away from a traveling direction of a vehicle, thereby deteriorating the operability of the touch panel.
- In view of the above and other problems, the present invention provides an input operation system that provides an improved operability for a user who uses a touch panel for controlling vehicular devices in a vehicle.
- In one aspect of the present invention, the input operation system for use in a vehicle includes a touch panel that senses a touching finger of a user, an imaging unit that images the touch panel, an outline extraction unit that extracts an outline image of the touching finger of the user based on an image of the touch panel, an image generation unit that generates a composite image by superposing the outline image of the touching finger extracted by the outline extraction unit on a switch operation screen, and a display control unit that displays the composite image generated by the image generation unit on a display panel. The touch panel of the input operation system has a touch sensitive surface having a protrusion disposed thereon, and the image generation unit generates the composite image by superposing the outline image of the touching finger on the switch operation screen that is backed by an image of the protrusion. In this manner, the protrusion on the surface of the touch panel is displayed as a background of the switch operation screen, thereby enabling the user to intuitively position the touching finger relative to a menu item of the switch operation screen without looking at the touch panel for an improved operability
- In another aspect of the present invention, the touch panel is automatically lit up by a backlight after a certain period of out-of-contact condition with a vehicular device such as a steering wheel or the like, thereby enabling the user to easily located the touch panel, based on an assistance of an operation detection unit and a lamp control unit for detecting the condition and controlling the backlight.
- Other objects, features and advantages of the present invention will become more apparent from the following detailed description made with reference to the accompanying drawings, in which:
-
FIG. 1 shows a block diagram of an input operation system in an embodiment of the present disclosure; -
FIGS. 2A and 2B show illustrations of a touch panel of the input operation system; -
FIGS. 3A and 3B show illustrations of the touch panel under control of a user's hand and an outline image superposed on a switch operation screen; -
FIGS. 4A and 4B show illustrations of the touch panel with markers formed thereon; -
FIG. 5 shows a block diagram of the input operation system in another embodiment of the present disclosure; and -
FIG. 6 shows a flowchart of a control process of the input operation system in the another embodiment of the present disclosure. - Embodiments of the present invention will now be described with reference to the accompanying the drawings.
- A first embodiment of an
input operation system 1 in the present disclosure is described with reference toFIGS. 1 to 4B . Theinput operation system 1 is for use in a vehicle. Thesystem 1 includes atouch switch 2, acamera 3, acontrol unit 4, adisplay 5, and anavigation unit 6. - The
touch switch 2 includes atouch panel 7 for sensing a touching finger of a user, and abacklight 8 for lighting thetouch panel 7. Thetouch switch 2 is, for example, disposed on a center console or the like for an access by the user in a driving position with his/her left hand without releasing a right hand from a steering wheel. When thetouch switch 2 is sensing the touching finger, theswitch 2 outputs, to thecontrol unit 4, position coordinates that represents a position of the touching finger on thetouch panel 7. In this case, asensing surface 7 a of thetouch panel 7 has a logotype of DENSO formed thereon as shown inFIG. 2A . The logotype of ‘D’‘E’‘N’‘S’‘O’ is formed in a protruding manner on thesurface 7 a, and respective letters of ‘D’ 9, ‘E’ 10, ‘N’ 11, ‘S’ 12, ‘O’ 13 (protrusions 9-13) have a predetermined protrusion amount that does not obstruct smooth transitional movement of the touching finger on thesurface 7 a. - The
camera 3 is a CCD camera, CMOS camera or the like in the present embodiment, and is disposed on a roof in the vehicle, a back of an instrument panel/abacklight 8 or the like. In this manner, thecamera 3 images an entire area of thetouch panel 7, and outputs image data to thecontrol unit 4. Thecamera 3 may also be an infrared camera or the like. - The
display 5 is, for example, a liquid crystal display positioned in a normal forward view of a driver, and displays a composite image that is inputted from thecontrol unit 4. The size of thedisplay 5 is preferably the same as the size of thetouch panel 7. Thenavigation unit 6 outputs a switch operation screen such as a menu screen, a destination setting screen, a search screen or the like to thecontrol unit 4, and performs an operation according to a switch control signal from thecontrol unit 4 when the switch control signal is provided. - The
control unit 4 includes a CPU, a RAM, a ROM and other components, and the respective components perform their own function for rendering functions of an outline extraction processing module, a touch detection processing module, a composite image generation processing module, and a display control processing module. In this manner, thecontrol unit 4 controls an entire operation of theinput operation system 1. More practically, thecontrol unit 4 executes the outline extraction processing module upon having an input of the image data from thecamera 3, and extracts an outline of the touching finger of the user's hand based on the image of the user's hand. The image of the touching finger of the user's hand is processed by using, for example, Chromakey process. In this case, the surface color of thetouch panel 7 is set to have the color of blue as a complementary color of skin, and thebacklight 8 lights thetouch panel 7. In this manner, an influence of ambient light is decreased. - The
control unit 4 executes the touch detection processing module upon having an input of the position coordinates of the touching finger from thetouch switch 7 for detecting a touching position of the user's hand. Further, thecontrol unit 4 executes the composite image generation module for generating a composite image of the switch operation screen by having the outline image of the touching finger superposed on the switch operation screen with the DENSO logo background. That is, the composite image displays the positions of the protrusions 9-13 on thesurface 7 a of thetouch panel 7 as a background image of the switch operation screen. Then, thecontrol unit 4 executes the display control processing module to display the composite image on thedisplay 5 as shown inFIG. 2B . In this case, the positions (i.e., the coordinates) of the protrusions 9-13 of the DENSO logo in thetouch panel 7 are substantially same as the positions (i.e., the coordinates) of a display position of the DENSO logo on thedisplay 5. - When the protrusions 9-13 on the
surface 7 a of thetouch panel 7 are displayed as the background image of the switch operation screen, the user is enabled to have a tactile feedback from the touching finger on thetouch panel 7 without actually looking at thetouch panel 7 as shown inFIGS. 3A and 3B . More practically, the DENSO logo in the background image provides the user with a clue for positioning the touching finger at a right controlling position of respective menu items on the switch operation screen relative to the position of protrusions 9-13 formed on thesurface 7 a of thetouch panel 7. - The protrusions 9-13 formed on the
- The protrusions 9-13 formed on the surface 7a of the touch panel 7 may be replaced with marking protrusions 14-16 shaped as a star, a disk, and a square, as shown in FIG. 4A. In this case, the background image of the switch operation screen in the composite image is changed to a background image with the corresponding shapes of the marking protrusions 14-16, as shown in FIG. 4B.
- Further, the positions of the protrusions 9-16 in the image derived from the camera 3 may be utilized for accurately positioning the outline image of the user's hand in the composite image, based on an analysis that compares the positions of the protrusions 9-16 in the image data from the camera 3 with the positions of the protrusions 9-16 in the composite image outputted to the display 5. In this case, the colors of the protrusions 9-16 may be made different from the surface color of the touch panel 7 so that the positions of the protrusions 9-16 can be extracted reliably.
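The comparison described above can be sketched as a landmark-alignment step. The patent only states that the two position sets are compared; the pure-translation model below is an assumption made for illustration (a camera mounted off-axis might instead need a full homography):

```python
import numpy as np

def camera_to_display_offset(camera_pts, display_pts):
    """Estimate the translation aligning camera-detected protrusion
    positions with their known positions in the composite image.

    camera_pts, display_pts: Nx2 arrays of (x, y) coordinates for the
    same protrusions, in the same order.
    """
    camera_pts = np.asarray(camera_pts, dtype=float)
    display_pts = np.asarray(display_pts, dtype=float)
    # The least-squares translation is the mean pointwise difference.
    return (display_pts - camera_pts).mean(axis=0)

def align_outline(outline_pts, offset):
    """Shift extracted hand-outline coordinates by the estimated offset."""
    return np.asarray(outline_pts, dtype=float) + offset
```

Giving the protrusions a color distinct from the panel surface, as the text suggests, is what makes `camera_pts` easy to detect in the first place.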
- The advantages of the input operation system 1 in the present embodiment are summarized as follows. The input operation system 1 for use in the vehicle has the protrusions 9-16 on the surface 7a of the touch panel 7 in association with the protrusion-marked background image of the switch operation screen in the composite image, thereby providing the user with improved operability based on intuitive tactile feedback from the touching finger for locating the respective menu items on the touch panel 7 without a glance. Further, the protrusions 9-16 formed on the surface 7a of the touch panel 7 provide an aesthetic improvement by taking an industrially and ergonomically reasonable shape.
- A second embodiment of the present disclosure is described with reference to FIGS. 5 and 6. The description of the second embodiment mainly focuses on its differences from the first embodiment. An input operation system 21 in the present embodiment includes a touch switch 22, a camera 23, a control unit 24, a display 25, and a navigation unit 26.
- The touch switch 22 includes a touch panel 27 and a backlight 28, and outputs to the control unit 24 an operation detection signal that represents either an in-contact condition or an out-of-contact condition of the touching finger of the user with the touch panel 27. The touch panel 27 also outputs the contact position coordinates of the touching finger to the control unit 24 when the touching finger is touching the touch panel 27. The touch switch 22 lights itself with the backlight 28 according to a lamp control signal upon receiving the lamp control signal from the control unit 24.
- A steering wheel 29 for steering the vehicle has an operation detection sensor 30 for detecting an in-contact condition and an out-of-contact condition of the touching finger with the steering wheel 29. The operation detection sensor 30 outputs to the control unit 24 an operation detection signal that represents the respective contact conditions of the touching finger of the user with the steering wheel 29.
- The control unit 24 executes a steering operation detection module, a touch panel operation detection module, and a backlight control module besides the modules described in the first embodiment. The control unit 24 determines whether the touching finger of the user is in the in-contact or the out-of-contact condition with the steering wheel 29 by executing the steering operation detection module, which analyzes the operation detection signal from the operation detection sensor 30, and also determines whether the touching finger of the user is in the in-contact or the out-of-contact condition with the touch panel 27 by executing the touch panel operation detection module, which analyzes the operation detection signal from the touch switch 22. Further, the control unit 24 controls the lighting operation of the backlight 28 by executing the backlight control module.
- A control process of the input operation system 21 in the present embodiment is described with reference to the flowchart in FIG. 6.
- The control unit 24 determines in step S1 whether the touching finger of the user has lost contact with the steering wheel 29, and starts a clock to measure a first lapse time in step S2 when the out-of-contact condition of the user with the steering wheel 29 is detected (step S1:YES) based on the operation detection signal from the operation detection sensor 30. Then, in step S3, whether a preset first lapse time has passed is determined, and in step S4, whether the touching finger of the user is touching the steering wheel 29 again is determined.
- When the control unit 24 detects that the preset first lapse time has passed before detecting contact of the touching finger of the user with the steering wheel 29 (step S3:YES), that is, when the out-of-contact condition of the touching finger with the steering wheel 29 has exceeded the preset first lapse time, the control unit 24 outputs the lamp control signal to the touch switch 22 for turning on the backlight 28 at a normal level of brightness in step S5. Then, the control unit 24 starts the clock to measure a second and a third lapse time in steps S6 and S7, and determines whether the touching finger has contacted the touch panel 27 in step S8 as well as whether the preset second lapse time has passed in step S9. In this case, the preset third lapse time is configured to be longer than the preset second lapse time.
- When the control unit 24 detects that the preset second lapse time has passed before detecting contact of the touching finger of the user with the touch panel 27 (step S9:YES), that is, when the out-of-contact condition of the touching finger with the touch panel 27 after the turning-on of the backlight 28 at the normal level of brightness has exceeded the preset second lapse time, the control unit 24 determines whether the brightness of the backlight 28 has reached the maximum level in step S10, and increases the brightness of the backlight 28 by one level in step S11 upon detecting that the brightness of the backlight 28 has not reached the maximum level (step S10:NO). Then, the control unit 24 resumes measurement of the second lapse time in step S12, and returns to step S8 to continue the control process.
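The S8-S12 loop amounts to stepping the backlight up one level each time the second lapse timer expires with no touch. A minimal sketch of that loop (discrete time, level values, and the simulation interface are all illustrative assumptions):

```python
def run_brightness_escalation(touch_after_s, second_lapse_s, max_level, normal_level=1):
    """Simulate the S8-S12 loop: each expiry of the preset second lapse
    time (S9) with no panel touch raises the backlight one level (S11),
    capped at the maximum (S10), until the finger touches the panel (S8).

    Returns the backlight level at the moment of the touch.
    """
    level = normal_level
    elapsed = 0.0
    while elapsed < touch_after_s:
        elapsed += second_lapse_s          # S9/S12: second lapse timer expires, restarts
        if elapsed >= touch_after_s:       # S8: the touch arrived first
            break
        if level < max_level:              # S10:NO -> S11: brighten one level
            level += 1
    return level
```

For example, a touch 7 seconds after lighting, with a 3-second lapse setting, sees two escalations before the finger lands.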
- When the control unit 24 detects that the touching finger of the user has contacted the touch panel 27 before the elapse of the preset second lapse time (step S8:YES), that is, when the user has touched the touch panel 27 with the touching finger before the elapse of the preset second lapse time after the turning-on of the backlight 28 at the normal level of brightness, the control unit 24 determines the brightness level of the backlight 28 at the time of the detection of the contact with the touch panel in step S13. When the brightness level of the backlight 28 is not at the normal level (step S13:NO), the control unit 24 puts the brightness level of the backlight 28 back to the normal level in step S14.
- Then, the control unit 24 determines whether the touching finger of the user has lost contact with the touch panel 27 in step S15 and whether the touching finger of the user has contacted the steering wheel 29 in step S16, and concludes the input system control process by turning off the backlight 28 in step S17 after having a determination result in step S15 that the user has lost contact with the touch panel 27 (step S15:YES) and a determination result in step S16 that the user has contacted the steering wheel 29 with the touching finger (step S16:YES).
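Taken together, steps S1-S17 describe a small state machine driven by wheel-contact and panel-contact events. The sketch below models that flow in this spirit; the state and event names are our own, timer expirations are represented as external events, and the brightness-escalation loop and third-lapse timeout are omitted for brevity:

```python
import enum

class State(enum.Enum):
    OFF = enum.auto()         # backlight off; finger on the steering wheel
    WAIT_FIRST = enum.auto()  # finger off the wheel; first lapse timer running (S2-S4)
    LIT = enum.auto()         # backlight on at normal brightness (S5)
    TOUCHING = enum.auto()    # finger on the touch panel (S13/S14)
    RELEASED = enum.auto()    # finger left the panel (S15); waiting for wheel contact (S16)

class BacklightController:
    def __init__(self):
        self.state = State.OFF
        self.backlight_on = False

    def on_wheel_released(self):       # S1:YES -> start the first lapse timer (S2)
        if self.state is State.WAIT_FIRST or self.state is State.OFF:
            self.state = State.WAIT_FIRST

    def on_wheel_touched(self):        # S4:YES cancels; S16:YES concludes (S17)
        if self.state is State.WAIT_FIRST:
            self.state = State.OFF
        elif self.state is State.RELEASED:
            self.backlight_on = False  # S17: turn the backlight off
            self.state = State.OFF

    def on_first_lapse_expired(self):  # S3:YES -> backlight on at normal level (S5)
        if self.state is State.WAIT_FIRST:
            self.backlight_on = True
            self.state = State.LIT

    def on_panel_touched(self):        # S8:YES -> brightness restored to normal (S13/S14)
        if self.state is State.LIT:
            self.state = State.TOUCHING

    def on_panel_released(self):       # S15:YES
        if self.state is State.TOUCHING:
            self.state = State.RELEASED
```

Note how the two-condition conclusion (S15 and S16) falls out naturally: turning the backlight off requires passing through RELEASED (panel released) and then receiving a wheel-contact event.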
- The advantages of the second embodiment of the input operation system 21 are summarized as follows. The input operation system 21 lights the touch panel 27 with the backlight 28 at the normal brightness level after the preset first lapse time from the loss of contact of the touching finger with the steering wheel 29, thereby providing improved operability of the touch panel in terms of locating the touch panel 27 without a look-away from the front direction of the vehicle, or with a minimum amount of look-away from the front direction. Further, the brightness of the backlight 28 is configured to be put back to the normal level after the preset second lapse time of the out-of-contact condition of the touching finger with the touch panel 27, suppressing unnecessary battery consumption. Furthermore, unnecessary battery consumption is further suppressed by having the preset third lapse time for turning off the backlight upon detecting the out-of-contact condition of the touching finger with the touch panel, or upon detecting the combined conditions of loss of contact with the touch panel 27 and contact with the steering wheel 29 by the touching finger. - Although the present invention has been fully described in connection with the preferred embodiment thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art.
- For example, the control units, the displays, and the navigation units are not limited to the configurations described in the embodiments. - The logos and protrusions in the first embodiment may have other forms and other qualities.
- The operation detection sensor may be provided on a shift knob in the second embodiment, and the backlight control may be provided based on the in- and out-of-contact conditions of the touching finger with the shift knob in combination with the contact conditions with the steering wheel.
- Such changes and modifications are to be understood as being within the scope of the present invention as defined by the appended claims.
Claims (6)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006133879A JP4635957B2 (en) | 2006-05-12 | 2006-05-12 | In-vehicle operation system |
JP2006-133879 | 2006-05-12 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070262970A1 true US20070262970A1 (en) | 2007-11-15 |
US8164574B2 US8164574B2 (en) | 2012-04-24 |
Family
ID=38684655
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/798,183 Expired - Fee Related US8164574B2 (en) | 2006-05-12 | 2007-05-10 | Touch panel input system for vehicle |
Country Status (2)
Country | Link |
---|---|
US (1) | US8164574B2 (en) |
JP (1) | JP4635957B2 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080231608A1 (en) * | 2007-03-23 | 2008-09-25 | Denso Corporation | Operating input device for reducing input error |
US20100045624A1 (en) * | 2008-08-21 | 2010-02-25 | Denso Corporation | Manipulation input apparatus |
US20100277438A1 (en) * | 2009-04-30 | 2010-11-04 | Denso Corporation | Operation apparatus for in-vehicle electronic device and method for controlling the same |
US20110296340A1 (en) * | 2010-05-31 | 2011-12-01 | Denso Corporation | In-vehicle input system |
US20120154307A1 (en) * | 2010-12-21 | 2012-06-21 | Sony Corporation | Image display control apparatus and image display control method |
US20130076499A1 (en) * | 2011-09-26 | 2013-03-28 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation input apparatus and control method for vehicle operation input apparatus |
CN103026328A (en) * | 2011-05-26 | 2013-04-03 | 松下电器产业株式会社 | Electronic device, and method for editing composite images |
CN103186283A (en) * | 2011-12-27 | 2013-07-03 | 爱信艾达株式会社 | Operation input device |
US20130238585A1 (en) * | 2010-02-12 | 2013-09-12 | Kuo-Ching Chiang | Computing Device with Visual Image Browser |
US20130265228A1 (en) * | 2012-04-05 | 2013-10-10 | Seiko Epson Corporation | Input device, display system and input method |
CN105653755A (en) * | 2015-07-21 | 2016-06-08 | 上海趣驾信息科技有限公司 | Automobile navigation interface designing tool based on SGE graphic base |
US9540016B2 (en) * | 2014-09-26 | 2017-01-10 | Nissan North America, Inc. | Vehicle interface input receiving method |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010049460A (en) * | 2008-08-21 | 2010-03-04 | Denso Corp | Operation input device |
JP2010066899A (en) * | 2008-09-09 | 2010-03-25 | Sony Computer Entertainment Inc | Input device |
US8344870B2 (en) * | 2008-10-07 | 2013-01-01 | Cisco Technology, Inc. | Virtual dashboard |
KR20130067110A (en) * | 2011-12-13 | 2013-06-21 | 현대자동차주식회사 | Menu operation device and method in vehicles |
DE102013220830A1 (en) | 2013-10-15 | 2015-04-16 | Bayerische Motoren Werke Aktiengesellschaft | Input device with touch screen for vehicles |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6008800A (en) * | 1992-09-18 | 1999-12-28 | Pryor; Timothy R. | Man machine interfaces for entering data into a computer |
US6198475B1 (en) * | 1997-06-26 | 2001-03-06 | Kabushiki Kaisha Tokai-Rika-Denki-Seisakusho | Touch operation information output device |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US20030096594A1 (en) * | 2001-10-24 | 2003-05-22 | Naboulsi Mouhamad Ahmad | Safety control system for vehicles |
US20030098803A1 (en) * | 2001-09-18 | 2003-05-29 | The Research Foundation Of The City University Of New York | Tactile graphic-based interactive overlay assembly and computer system for the visually impaired |
US6725064B1 (en) * | 1999-07-13 | 2004-04-20 | Denso Corporation | Portable terminal device with power saving backlight control |
US20040195031A1 (en) * | 2001-10-30 | 2004-10-07 | Chikao Nagasaka | Car-mounted device control system |
US20050078085A1 (en) * | 2001-09-07 | 2005-04-14 | Microsoft Corporation | Data input device power management including beacon state |
US20050134117A1 (en) * | 2003-12-17 | 2005-06-23 | Takafumi Ito | Interface for car-mounted devices |
US20060060658A1 (en) * | 2004-09-21 | 2006-03-23 | Proffitt Jerry L | Configurable multi-level thermostat backlighting |
US7084859B1 (en) * | 1992-09-18 | 2006-08-01 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US20070262965A1 (en) * | 2004-09-03 | 2007-11-15 | Takuya Hirai | Input Device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3158513B2 (en) | 1991-08-15 | 2001-04-23 | ヤマハ株式会社 | Touch panel |
JPH07319623A (en) * | 1994-05-25 | 1995-12-08 | Canon Inc | Touch panel type liquid crystal display |
JPH10269012A (en) * | 1997-03-28 | 1998-10-09 | Yazaki Corp | Touch panel controller and information display device using the same |
JP2001033200A (en) | 1999-07-21 | 2001-02-09 | Nishitetsu Kenki Kk | Method and system for boring insertion hole of bomb searcher |
JP2004026046A (en) * | 2002-06-26 | 2004-01-29 | Clarion Co Ltd | Operating device for on-vehicle information equipment |
JP2006003787A (en) * | 2004-06-21 | 2006-01-05 | Fujitsu Ten Ltd | Display control device |
- 2006
- 2006-05-12 JP JP2006133879A patent/JP4635957B2/en not_active Expired - Fee Related
- 2007
- 2007-05-10 US US11/798,183 patent/US8164574B2/en not_active Expired - Fee Related
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6008800A (en) * | 1992-09-18 | 1999-12-28 | Pryor; Timothy R. | Man machine interfaces for entering data into a computer |
US7084859B1 (en) * | 1992-09-18 | 2006-08-01 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US6198475B1 (en) * | 1997-06-26 | 2001-03-06 | Kabushiki Kaisha Tokai-Rika-Denki-Seisakusho | Touch operation information output device |
US6725064B1 (en) * | 1999-07-13 | 2004-04-20 | Denso Corporation | Portable terminal device with power saving backlight control |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US20050078085A1 (en) * | 2001-09-07 | 2005-04-14 | Microsoft Corporation | Data input device power management including beacon state |
US20030098803A1 (en) * | 2001-09-18 | 2003-05-29 | The Research Foundation Of The City University Of New York | Tactile graphic-based interactive overlay assembly and computer system for the visually impaired |
US20030096594A1 (en) * | 2001-10-24 | 2003-05-22 | Naboulsi Mouhamad Ahmad | Safety control system for vehicles |
US20040195031A1 (en) * | 2001-10-30 | 2004-10-07 | Chikao Nagasaka | Car-mounted device control system |
US20050134117A1 (en) * | 2003-12-17 | 2005-06-23 | Takafumi Ito | Interface for car-mounted devices |
US20070262965A1 (en) * | 2004-09-03 | 2007-11-15 | Takuya Hirai | Input Device |
US20060060658A1 (en) * | 2004-09-21 | 2006-03-23 | Proffitt Jerry L | Configurable multi-level thermostat backlighting |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8363010B2 (en) | 2007-03-23 | 2013-01-29 | Denso Corporation | Operating input device for reducing input error |
US20080231608A1 (en) * | 2007-03-23 | 2008-09-25 | Denso Corporation | Operating input device for reducing input error |
US20100045624A1 (en) * | 2008-08-21 | 2010-02-25 | Denso Corporation | Manipulation input apparatus |
US8384687B2 (en) * | 2008-08-21 | 2013-02-26 | Denso Corporation | Manipulation input apparatus |
US8593417B2 (en) * | 2009-04-30 | 2013-11-26 | Denso Corporation | Operation apparatus for in-vehicle electronic device and method for controlling the same |
US20100277438A1 (en) * | 2009-04-30 | 2010-11-04 | Denso Corporation | Operation apparatus for in-vehicle electronic device and method for controlling the same |
US20130238585A1 (en) * | 2010-02-12 | 2013-09-12 | Kuo-Ching Chiang | Computing Device with Visual Image Browser |
US20110296340A1 (en) * | 2010-05-31 | 2011-12-01 | Denso Corporation | In-vehicle input system |
US9555707B2 (en) * | 2010-05-31 | 2017-01-31 | Denso Corporation | In-vehicle input system |
US20120154307A1 (en) * | 2010-12-21 | 2012-06-21 | Sony Corporation | Image display control apparatus and image display control method |
US9703403B2 (en) * | 2010-12-21 | 2017-07-11 | Sony Corporation | Image display control apparatus and image display control method |
CN103026328A (en) * | 2011-05-26 | 2013-04-03 | 松下电器产业株式会社 | Electronic device, and method for editing composite images |
US9221341B2 (en) * | 2011-09-26 | 2015-12-29 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation input apparatus and control method for vehicle operation input apparatus |
CN103019524A (en) * | 2011-09-26 | 2013-04-03 | 丰田自动车株式会社 | Vehicle operation input apparatus and control method for vehicle operation input apparatus |
US20130076499A1 (en) * | 2011-09-26 | 2013-03-28 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation input apparatus and control method for vehicle operation input apparatus |
CN103186283A (en) * | 2011-12-27 | 2013-07-03 | 爱信艾达株式会社 | Operation input device |
US20130265228A1 (en) * | 2012-04-05 | 2013-10-10 | Seiko Epson Corporation | Input device, display system and input method |
US9134814B2 (en) * | 2012-04-05 | 2015-09-15 | Seiko Epson Corporation | Input device, display system and input method |
US9540016B2 (en) * | 2014-09-26 | 2017-01-10 | Nissan North America, Inc. | Vehicle interface input receiving method |
CN105653755A (en) * | 2015-07-21 | 2016-06-08 | 上海趣驾信息科技有限公司 | Automobile navigation interface designing tool based on SGE graphic base |
Also Published As
Publication number | Publication date |
---|---|
JP4635957B2 (en) | 2011-02-23 |
US8164574B2 (en) | 2012-04-24 |
JP2007304953A (en) | 2007-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8164574B2 (en) | Touch panel input system for vehicle | |
US8085243B2 (en) | Input device and its method | |
JP4757091B2 (en) | Operation device for on-vehicle equipment | |
EP2010411B1 (en) | Operating device | |
US8570290B2 (en) | Image display device | |
EP2544078B1 (en) | Display device with adaptive capacitive touch panel | |
JP4677893B2 (en) | In-vehicle remote control device | |
JP4389855B2 (en) | Vehicle control device | |
JP4747810B2 (en) | In-vehicle remote control device | |
US8538090B2 (en) | Device for manipulating vehicle built-in devices | |
US20090002342A1 (en) | Information Processing Device | |
US20090195372A1 (en) | Apparatus for extracting operating object and apparatus for projecting operating hand | |
US20070262965A1 (en) | Input Device | |
JP2007302116A (en) | Operating device of on-vehicle equipment | |
JP2006091645A (en) | Display device | |
JP2006134184A (en) | Remote control switch | |
JP2006059238A (en) | Information input display device | |
JP5382313B2 (en) | Vehicle operation input device | |
JP2006264615A (en) | Display device for vehicle | |
JP2007286667A (en) | On-vehicle operation system | |
JP2008158675A (en) | Operation device for vehicle | |
JP2012018465A (en) | Operation input device and method of controlling the same | |
JP2007156950A (en) | Vehicle operating device | |
EP2988194B1 (en) | Operating device for vehicle | |
JP5240277B2 (en) | In-vehicle remote control device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DENSO CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MASUMOTO, YUUJI;SATO, YUJI;ITO, TOSHIYUKI;AND OTHERS;REEL/FRAME:019476/0160;SIGNING DATES FROM 20070417 TO 20070425 Owner name: DENSO CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MASUMOTO, YUUJI;SATO, YUJI;ITO, TOSHIYUKI;AND OTHERS;SIGNING DATES FROM 20070417 TO 20070425;REEL/FRAME:019476/0160 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20200424 |