US20040160458A1 - Speed dependent automatic zooming interface - Google Patents
- Publication number
- US20040160458A1 (application US 10/774,797)
- Authority
- US
- United States
- Prior art keywords
- scale
- map
- speed
- display
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- This invention relates generally to interfaces for viewing content, and particularly to a speed-dependent automatic zooming interface for viewing content.
- a commonplace application for computers and computerized devices is the viewing of content such as web pages, graphic images, maps, word processing documents, etc.
- their size is too large to view them in their entirety on a display at full scale, even on relatively large displays.
- a document of more than one or two pages in length, or an image file having a width greater than 1,024 pixels and a height greater than 768 pixels is typically not completely viewable at full scale on a typical seventeen-inch monitor.
- Scroll bars allow a user to change what part of a document or image is currently viewable on the display.
- a vertical scroll bar allows a user to navigate a document or image in the vertical direction
- a horizontal scroll bar allows a user to navigate the document or image in the horizontal direction.
- a user thus is able to change what portion of the document or image is currently viewable on the display by scrolling through the document or image.
- One common input device, a mouse having a wheel, is particularly well suited for scrolling through the document or image in one direction at a time, by using its wheel.
- the invention relates to speed-dependent automatic zooming through content such as documents and images.
- a method first receives an input, such as a user input on an input device like a mouse or a trackball.
- Other input devices amenable to an embodiment of the invention include self-centering input devices, such as self-centering joysticks, levers, etc.
- the input is mapped to either speed of navigation through a content space, or scale of the content space while being navigated.
- the other of speed or scale to which the input was not mapped is then determined, based on the relationship that scale times speed equals a constant.
- the content space is then navigated, based on the speed or scale mapped from the input, and the scale or speed determined.
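The method just described can be sketched in a few lines; the constant k, the direct use of the raw input value as speed, and the per-step position update are illustrative assumptions rather than details fixed by the patent:

```python
def navigate_step(position, input_value, k=100.0, map_to="speed"):
    """One step of speed-dependent automatic zooming.

    Maps the input to either speed or scale, then derives the
    other from the relationship scale * speed == k (a constant).
    Returns the new position and the scale to render at.
    """
    if map_to == "speed":
        speed = input_value                 # input directly controls speed
        scale = k / speed if speed else 1.0
    else:
        scale = input_value                 # input directly controls scale
        speed = k / scale
    position += speed                       # advance through the content space
    return position, scale

# Faster navigation yields a proportionally smaller (zoomed-out) scale:
pos, scale_slow = navigate_step(0.0, 50.0)   # slow scrolling
pos, scale_fast = navigate_step(0.0, 200.0)  # fast scrolling
```

Because scale times speed is held at k, the perceptual scrolling speed (scale multiplied by speed) stays constant no matter how fast the user navigates.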
- Embodiments of the invention provide for advantages not found within the prior art.
- a single input, such as movement of a mouse, ultimately controls both speed of navigation through a content, and the scale of that content while being navigated. For example, while content is being navigated quickly, the scale of the content is reduced, so the user can still easily get a sense for where he or she is navigating within the content. Then, when the user slows down navigation, the scale of the content automatically is increased, so the user is able to precisely locate an exact desired point within the content. This is as compared with the prior art, which requires separate user inputs to control scale of the content while being navigated and speed of navigation through the content.
- perceptual benefits are provided for the user.
- the perceptual scrolling speed, which is the visual speed of the document across the screen, becomes too fast to read the document, and the user can become disoriented.
- the perceptual scrolling speed remains constant, by controlling the zooming level based on the relationship that scale times speed equals a constant. Thus, the user does not become disoriented within the document.
- the invention includes computer-implemented methods, machine-readable media, computerized systems, and computers of varying scopes. Other aspects, embodiments and advantages of the invention, beyond those described here, will become apparent by reading the detailed description and with reference to the drawings.
- FIG. 1 is a diagram of an operating environment in conjunction with which embodiments of the invention can be practiced;
- FIGS. 2 - 4 are diagrams illustrating an example of navigation through content in accordance with an embodiment of the invention.
- FIG. 5 is a diagram of an indicator that can be used with an embodiment of the invention.
- FIG. 6 is a diagram of a graph showing the relationship between scale and speed as affected by input such as a change in position of an input device, such as a pointing device, according to one embodiment of the invention;
- FIG. 7 is a flowchart of a method according to an embodiment of the invention.
- FIG. 8 is a diagram of a system according to an embodiment of the invention.
- Referring to FIG. 1, a diagram of the hardware and operating environment in conjunction with which embodiments of the invention may be practiced is shown.
- the description of FIG. 1 is intended to provide a brief, general description of suitable computer hardware and a suitable computing environment in conjunction with which the invention may be implemented.
- the invention is described in the general context of computer-executable instructions, such as program modules, being executed by a computer, such as a personal computer.
- program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
- the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PC's, minicomputers, mainframe computers, and the like.
- the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- program modules may be located in both local and remote memory storage devices.
- the exemplary hardware and operating environment of FIG. 1 for implementing the invention includes a general purpose computing device in the form of a computer 20 , including a processing unit 21 , a system memory 22 , and a system bus 23 that operatively couples various system components, including the system memory, to the processing unit 21 .
- There may be only one processing unit 21 , or there may be more than one, such that the processor of computer 20 comprises a single central-processing unit (CPU), or a plurality of processing units, commonly referred to as a parallel processing environment.
- the computer 20 may be a conventional computer, a distributed computer, or any other type of computer; the invention is not so limited.
- the system bus 23 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- the system memory may also be referred to as simply the memory, and includes read only memory (ROM) 24 and random access memory (RAM) 25 .
- a basic input/output system (BIOS) 26 containing the basic routines that help to transfer information between elements within the computer 20 , such as during start-up, is stored in ROM 24 .
- the computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk, not shown, a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29 , and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media.
- the hard disk drive 27 , magnetic disk drive 28 , and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32 , a magnetic disk drive interface 33 , and an optical disk drive interface 34 , respectively.
- the drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computer 20 . It should be appreciated by those skilled in the art that any type of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may be used in the exemplary operating environment.
- a number of program modules may be stored on the hard disk, magnetic disk 29 , optical disk 31 , ROM 24 , or RAM 25 , including an operating system 35 , one or more application programs 36 , other program modules 37 , and program data 38 .
- a user may enter commands and information into the personal computer 20 through input devices such as a keyboard 40 and pointing device 42 .
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port, or a universal serial bus (USB).
- a monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48 .
- computers typically include other peripheral output devices (not shown), such as speakers and printers.
- the computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as remote computer 49 . These logical connections are achieved by a communication device coupled to or a part of the computer 20 ; the invention is not limited to a particular type of communications device.
- the remote computer 49 may be another computer, a server, a router, a network PC, a client, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 20 , although only a memory storage device 50 has been illustrated in FIG. 1.
- the logical connections depicted in FIG. 1 include a local-area network (LAN) 51 and a wide-area network (WAN) 52 .
- Such networking environments are commonplace in office networks, enterprise-wide computer networks, intranets, and the Internet, all of which are types of networks.
- When used in a LAN-networking environment, the computer 20 is connected to the local network 51 through a network interface or adapter 53 , which is one type of communications device.
- When used in a WAN-networking environment, the computer 20 typically includes a modem 54 , a type of communications device, or any other type of communications device for establishing communications over the wide area network 52 , such as the Internet.
- the modem 54 , which may be internal or external, is connected to the system bus 23 via the serial port interface 46 .
- program modules depicted relative to the personal computer 20 may be stored in the remote memory storage device. It is appreciated that the network connections shown are exemplary, and that other means of establishing a communications link between the computers, and other communications devices, may be used.
- a singular user input affects both speed of navigation through content, as well as the scale of the content while it is being navigated.
- Navigation as used herein means movement through the content as viewable on a display. For example, at full scale, only a portion of the content may be viewable on the display, whereas at 25% scale, the entire content may be viewable on the display, or at least more of the content than when it is being viewed at full scale, but with less visible detail.
- the speed of navigation refers to the speed at which a visible portion of the content is moved through on the display.
- the speed of navigation can be likened to scrolling speed.
- the scale of the content is indicative of how much of the document is viewable on the display while it is being navigated. At full scale, all the details of the content are typically visible on the display; however, only a small part of the document is viewable on the display. Reducing scale increases the portion of the document that is viewable on the display, but with a reduction in the visible detail. For example, at full scale, one paragraph of a text document may be viewable on the display, such that all the words of the paragraph are visible. At a lesser scale, one page of the text document may be viewable, but such that the words of the page are smaller, and thus more difficult to discern.
- all the pages of the text document may be viewable, but likely such that only chapter headings and other larger indicia of the text document are visible.
- Scale can in one embodiment be likened to resolution of the content; however, that term is not used universally herein so as not to create confusion with the resolution of the display itself, which typically remains constant for a given display.
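Under a fixed viewport, this reciprocal relationship between scale and the viewable portion of a document can be sketched as follows; the 5% full-scale figure is an illustrative assumption:

```python
def visible_fraction(scale, full_scale_fraction=0.05):
    """Fraction of a document visible in a fixed-size viewport.

    At full scale (scale == 1.0) only full_scale_fraction of the
    document fits on the display; halving the scale doubles what
    fits, capped at the whole document (1.0).
    """
    return min(1.0, full_scale_fraction / scale)
```

So at full scale one paragraph may be visible, while at 25% scale four times as much of the document fits on the display, with correspondingly less visible detail.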
- the invention is not limited to a particular type of content that can be navigated in accordance therewith.
- Typical types of content include word processing documents, including those containing images and other objects in addition to text, text documents, images in various formats, maps (which are a particular type of image), web pages, etc.
- Other typical types of documents include spreadsheets, drawings and illustrations, such as those utilizing vector graphics, as well as three-dimensional virtual spaces. That is, embodiments of the invention are applicable to any content that has one or more of a one-dimensional, two-dimensional, or three-dimensional spatial nature.
- Some types of documents may have built-in levels of abstraction that are viewable at different scales.
- a word processing document created using an outline feature of a word processing program may at its highest (full, or 100%) scale have all the words of the document displayed.
- at a lesser scale, only chapter and section headings of the document may be displayed.
- at a still lesser scale, only chapter headings may be displayed.
- a map may at full scale have all roads and small towns indicated.
- at a lesser scale, only larger cities and major roads may be indicated.
- Embodiments of the invention are amenable to content that has such built-in levels of abstraction, as well as content that does not. For content that does not, words of a text document may simply appear smaller at lower scales as compared to higher scales, while an image may lose detail as it is displayed at lower scales.
- In FIG. 2, on a display 200 , a text document is shown being navigated within a window 202 .
- the document is shown at full scale in FIG. 2.
- only a part of the text document is displayed within the window 202 ; in particular, only the phrase “Chapter 1” and some lines of text are viewable in the window 202 at one time.
- the user is scrolling slowly through the document. As the user increases the speed of navigation, the scale of the document is automatically reduced, so that more of the document is viewable. This is shown in FIG.
- the scale of content as it is being navigated is related to the speed at which the content is being navigated.
- the scale is higher, such that less of the document is viewable on the display, but at greater detail.
- the scale is lower, such that more of the document is viewable on the display, but at less detail. For example, a user may navigate very quickly through a text document to find the chapter in which a desired point of interest (a particular paragraph within that chapter, for example) is located.
- the scale automatically is reduced so that more than one chapter heading (“Chapter 1”, “Chapter 2”, etc.) is viewable on the display at one time—although the text itself of the chapters is likely difficult to discern, as a result of the loss of detail.
- the user controls either speed or the scale by providing an input, such that the other of the speed or the scale is determined by the relationship speed times scale equals a constant.
- the user may control either speed or scale via an input device, such as a mouse.
- when the user presses down on a mouse button, this indicates that the user wishes to activate the automatic zooming interface according to an embodiment of the invention.
- the scrolling speed of the document changes in accordance with how far the user has moved the mouse. Moving the mouse only a little, for example, corresponds to a slow scrolling speed; moving the mouse a lot corresponds to a fast scrolling speed. Moving the mouse forward as opposed to backward controls the direction of scrolling.
- the user has affected both speed and scale by his or her singular input.
- the user's movement of the mouse a particular distance directly controls the speed of navigation, from which the scale of the document while being navigated is dependently determined. It is noted, however, and as can be appreciated by those of ordinary skill within the art, that the specific use of a mouse pointing device as described herein is only an example of operation of an embodiment of the invention, and the invention itself is not limited to this example.
- an indicator appears on the display when the user has activated the automatic zooming interface.
- Such an indicator is shown in the diagram of FIG. 5.
- the height of the indicator 500 corresponds to the length of the content being navigated.
- the bar 502 within the indicator 500 indicates the relative position within the document that is currently being shown on the display.
- Using an indicator is not required, but is useful in that it provides visual indication when the automatic zooming interface has been activated, as well as to allow the user to see where the viewable part of the document is in relation to the document as a whole.
- an automatic zooming interface can be activated in accordance with an embodiment of the invention to provide for the inter-relation of scale and speed via a singular user input simultaneously in both the x and y directions.
- because an input device such as a mouse or a joystick provides for movement across two dimensions, such an input device is well suited for use with embodiments of the invention providing for navigation of two-dimensional content such as images.
- embodiments of the invention can be applied to a variety of computers and computerized devices.
- a description of a computer has been provided in the previous section of the detailed description.
- Types of computers include laptop computers, desktop computers, and handheld computers, also known as personal digital assistants (PDA's).
- Electronic book devices and other computerized devices are amenable to the invention.
- a navigation system within a car in which a map is shown on a display is amenable to the invention.
- the input provided can be the speed at which the driver is driving, which is typically under control of the driver by virtue of the force exerted on the gas pedal.
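That in-car behavior can be sketched by treating vehicle speed as the navigation-speed input and deriving the map scale from the same scale-times-speed-equals-a-constant relationship; the constant k and the minimum-speed floor are illustrative assumptions:

```python
def map_scale_for_vehicle(speed_kmh, k=60.0, min_speed=5.0):
    """Derive a map display scale from vehicle speed.

    Uses scale = k / speed, so a fast-moving vehicle sees a
    zoomed-out map (wider area, less detail) and a slow-moving
    one sees a zoomed-in map. Speed is floored at min_speed so
    the scale stays bounded when the vehicle is stopped.
    """
    return k / max(speed_kmh, min_speed)
```

Driving faster thus automatically zooms the map out, matching the driver's need to see farther ahead at highway speeds.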
- Still other computers and computerized devices in addition to those referred to herein are also amenable to embodiments of the invention.
- that the input directly controls one of the speed and the scale can be stated as follows: one of the speed and the scale is a function of the input.
- Speed is navigation speed through the content
- scale is the zoom level, where 1 corresponds to the full-scale view, such that the smaller the scale becomes, the smaller the content is on the display (i.e., more of the content is viewable on the display).
- either speed or scale is linearly related to the change in position of the input device within a specific range of change in position, whereas in another embodiment, either speed or scale is exponentially related to the change in position of the input device within a specific range.
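Both mappings can be sketched as below; the range endpoints d0 and d1, the speed endpoints, and the exponential gain are illustrative assumptions:

```python
import math

def map_input_linear(d, d0=10.0, d1=200.0, v0=1.0, v1=50.0):
    """Linearly map a change in input-device position d (e.g. pixels
    of mouse movement) to a scrolling speed, clamped to [d0, d1]."""
    d = min(max(abs(d), d0), d1)
    return v0 + (v1 - v0) * (d - d0) / (d1 - d0)

def map_input_exponential(d, d0=10.0, d1=200.0, v0=1.0, gain=0.02):
    """Exponentially map the same displacement to speed, so small
    movements give fine control and large movements ramp up quickly."""
    d = min(max(abs(d), d0), d1)
    return v0 * math.exp(gain * (d - d0))
```

In either case, the scale while navigating would then be determined from the mapped speed via the scale-times-speed-equals-a-constant relationship.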
- In FIG. 6, a diagram of a graph 600 showing the relationship among speed, scale, and the change in position of the input device, according to one specific embodiment, is shown.
- the solid line 602 corresponds to scale, while the dotted line 604 corresponds to speed.
- the range 606 indicates the range in which changes of position affect speed and scale, that is, between d0 and d1.
- v0 is the starting speed at the beginning of the range 606 .
- the zoom level substantially changes when the user changes the navigation direction in order to go back to a position that has been passed. That is, in the process of moving the input device to the opposite side of the position where the speed-dependent automatic zooming interface was activated, dy gets closer to 0, causing a sudden zooming effect.
- a delayed zooming effect is utilized.
- Zoom level changes at most at a specific maximum rate. If the user moves the input device quickly, thereby requesting a sudden change of zoom (that is, scale) level, the zoom level changes with delay to achieve a smoother transition.
- this delay effect is applied only when the target scale as determined by dy is smaller than the current scale. This delay mechanism can also be referred to as a “controlled return” to the target scale.
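The controlled return can be sketched as a per-frame update of the displayed zoom level; the maximum per-frame rate is an illustrative assumption:

```python
def step_zoom(current_scale, target_scale, max_rate=0.1):
    """Advance the displayed zoom level one frame toward target_scale.

    When the target scale is smaller than the current one (zooming
    out), the change is capped at max_rate per frame, producing the
    "controlled return" delay; zooming back in is applied immediately.
    """
    if target_scale < current_scale:
        return max(target_scale, current_scale - max_rate)
    return target_scale
```

Calling this once per frame smooths the sudden zoom-out that would otherwise occur when dy passes back through zero on a direction change.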
- speed is set based on dy, a change in distance mapped from the input device, and scale is set as a constant divided by speed.
- scale is set to e^dy (that is, the exponential constant e raised to the dy power), and speed is set to a constant divided by scale, to cause a more user-pleasing automatic zooming effect.
- scale times speed equals a function of dy, that is, a function of movement of the input device.
- more generally, however, the invention is not limited to the particular functions listed above.
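The exponential embodiment above can be sketched directly; the constant c is an illustrative assumption:

```python
import math

def speed_and_scale_from_dy(dy, c=1.0):
    """Map input displacement dy to (speed, scale) using the
    exponential variant: scale = e**dy and speed = c / scale,
    so that scale * speed == c holds for every dy."""
    scale = math.exp(dy)
    speed = c / scale
    return speed, scale
```

Note that the product of the two returned values is the constant c regardless of dy, which is exactly the invariant that keeps the perceptual scrolling speed steady.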
- the methods are computer-implemented.
- the computer-implemented methods can be realized at least in part as one or more programs running on a computer—that is, as a program executed from a computer-readable medium such as a memory by a processor of a computer, such as the computer shown in and described in conjunction with FIG. 1.
- the programs are desirably storable on a machine-readable medium, such as a floppy disk or a CD-ROM, for distribution, installation, and execution on another computer.
- an input is received.
- the input can be a user input, such as a user input as asserted via an input device, such as a pointing device.
- the input device can be any type of input device, such as a self-centering joystick, a mouse, a mouse wheel, a joystick, a trackball, a touchpad, and a pointstick, although the invention itself is not limited to the list of input devices recited herein.
- either speed of navigation through a content space—that is, through content—or scale of the content space while being navigated is mapped from the input. This can be in accordance with the graph of FIG. 6, as described in the previous section of the detailed description.
- the input is mapped to scale, either linearly or exponentially; in another embodiment, the input is mapped to speed, also either linearly or exponentially.
- the invention is not limited to a particular type of content space; that is, it is not limited to a particular type of content. Types of content include word processing documents, maps, text documents, images, and web pages, although the invention itself is not limited to the types of content recited herein.
- the input is mapped to either scale or speed such that the scale or speed has a maximum rate of change in reflecting the input, as described in the previous section of the detailed description.
- the other of speed and scale to which the input was not mapped is determined based on the relationship speed times scale equals a constant. For example, where speed was mapped directly from the input in 702 , then in 704 the scale is determined based on this relationship. As another example, where scale was mapped directly from the input in 702 , then in 704 the speed is determined based on this relationship. Finally, in 706 , the content space is navigated, based on the speed and scale as mapped and determined in 702 and 704 .
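The 702/704/706 sequence can be exercised over a series of inputs; mapping the raw input directly to speed and the constant k = 100 are illustrative assumptions:

```python
def run_navigation(inputs, k=100.0):
    """Run the method of FIG. 7 over a list of inputs: map each
    input to speed (702), determine scale from scale * speed == k
    (704), and navigate the content space by advancing the
    position (706). Returns (position, scale) per frame."""
    position, frames = 0.0, []
    for d in inputs:
        speed = float(d)          # 702: input mapped to speed
        scale = k / speed         # 704: scale determined from speed
        position += speed         # 706: content space navigated
        frames.append((position, scale))
    return frames
```

Each frame's scale shrinks as the input-driven speed grows, so a burst of fast input automatically zooms the view out.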
- input devices that are amenable to embodiments of the invention include self-centering input devices, such as self-centering joysticks, and levers.
- the position, or angle, of the device is mapped to either speed or scaling, as has been described.
- speed can be made proportional to the force exerted on a self-centering isometric joystick, for example.
- Such self-centering input devices provide for certain advantages.
- the user can feel the current speed and/or scale, by touch, in that the device gives intuitive feedback, which may not be possible with a device such as a mouse.
- the user can stop scrolling and return to the original scale simply by releasing the device, such as the stick of the joystick.
- the physical set up for such devices is thus consistent with the behavior of embodiments of the invention.
- input devices such as buttons, including mouse buttons and keyboard keys, are also amenable to embodiments of the invention.
- pressing and holding down a first button can cause the speed to gradually increase, and the view to zoom out, while releasing the button causes scrolling to stop, and the zoom level to return to its original level.
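That button-driven behavior can be sketched as one state update per frame; the acceleration step and constant k are illustrative assumptions:

```python
def button_frame(speed, scale0, pressed, accel=2.0, k=100.0):
    """One frame of button-driven zooming: holding the button
    gradually increases speed (zooming out via scale = k / speed);
    releasing it stops scrolling and restores the original scale."""
    if pressed:
        speed += accel
        return speed, k / speed
    return 0.0, scale0   # released: stop and return to original scale
```

Calling this each frame while polling the button reproduces the gradual speed-up and the snap-back on release described above.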
- the system 800 includes an input device 802 , a display 804 , and a computer program 806 .
- the input device 802 , such as a pointing device, has at least an output based on a singular user input.
- the output may be based on the user moving the input device 802 over a planar surface, which corresponds to a singular user input, and which is the case when the input device 802 is a mouse.
- the output may also be based on a user rotating the input device 802 on a fixed axis, which corresponds also to a singular user input, and which is the case when the input device 802 is a wheel of a mouse.
- There may be more than one output of the input device 802 , such as is the case where a mouse can be moved over a planar surface (first output), has a rotatable wheel (second output), and two mouse buttons (third and fourth outputs).
- the system is concerned only with one output of the input device 802 , based on a singular user input.
- the display 804 can be, for example, a flat-panel display, a cathode-ray tube, etc.; the invention is not so limited.
- the display 804 is such that navigation of a content space is shown thereon.
- the computer program 806 is designed to receive the singular user input as output from the input device 802 , and to show the navigation of the content space on the display 804 such that it has speed of navigation and scale while being navigated based only on a user input comprising the singular user input. That is, both the speed and scale are based on the singular user input.
- the scale can be mapped from the user input to the scale of the content space while being navigated, and the speed of navigation through the content space can be determined based on the relationship that scale times speed equals a constant. In the sense that the scale is mapped from the singular user input, and the speed is determined from the scale, both the scale and the speed are based on the singular user input. That is, one input does not control only speed, while another input does not control only scale; rather, a single input affects both speed and scale.
- the user input is mapped to scale exponentially or linearly, while in another particular embodiment, the user input is mapped to the scale such that it has a maximum rate of change in reflecting the user input.
- speed of navigation is mapped from the user input, and the scale is determined from the speed based on the relationship that speed times scale equals a constant.
- the content space being navigated is not limited by the invention; it can include, for example, a word processing document, a map, a text document, an image, a web page, etc.
- the computer program 806 in one embodiment is executed from a computer-readable medium such as a memory or a hard disk drive by a processor.
- the program 806 corresponds to a means for receiving the singular user input, and for showing the navigation of the content space as having speed of navigation and scale while being navigated based only on the singular user input.
Abstract
Speed-dependent automatic zooming through content such as documents and images is disclosed. In one embodiment, a method first receives an input, such as a user input on an input device, like a joystick, a mouse, a trackball, or other pointing device. The input is mapped to either speed of navigation through a content space, or scale of the content space while being navigated. The other of speed or scale to which the input was not mapped is then determined, based on the relationship that scale times speed equals a constant. The content space is then navigated, based on the speed or scale mapped from the input, and the scale or speed determined.
Description
- This invention relates generally to interfaces for viewing content, and particularly to a speed-dependent automatic zooming interface for viewing content.
- A commonplace application for computers and computerized devices is the viewing of content such as web pages, graphic images, maps, word processing documents, etc. For most documents and images, their size is too large to view them in their entirety on a display at full scale, even on relatively large displays. For example, a document of more than one or two pages in length, or an image file having a width greater than 1,024 pixels and a height greater than 768 pixels, is typically not completely viewable at full scale on a typical seventeen-inch monitor.
- Most operating systems and application programs therefore have instituted scroll bar and zoom mechanisms for navigating such large documents and images. Scroll bars allow a user to change what part of a document or image is currently viewable on the display. For example, a vertical scroll bar allows a user to navigate a document or image in the vertical direction, while a horizontal scroll bar allows a user to navigate the document or image in the horizontal direction. A user thus is able to change what portion of the document or image is currently viewable on the display by scrolling through the document or image. One common input device, a mouse having a wheel, is particularly well suited for scrolling through the document or image in one direction at a time, by using its wheel.
- Using scroll bars to navigate a document or image does not change how much of the document or image is currently viewable on the display, though. The scale of the document remains constant. Therefore, in order to see more or less of a document or image—that is, to change the scale of the document or image—zooming is used. For example, an image may be “zoomed out” so that the entire image is viewable on a display at one time. The trade-off, however, is a loss of visible detail of the image, since the resolution of the display itself remains constant. Thus, at 100% scale, full detail of the image may be visible on the display, but only part of the image is typically viewable, whereas at 25% scale, the entire image may be viewable on the display, but with a loss of visible detail.
- Therefore, to quickly navigate a long document or a large image, a common approach is to first zoom out so that the entire document or image is viewable at reduced visible detail, locate the general part of the document or image that is of interest, zoom in on that part, and finally navigate that part to find the exact point of interest. This requires much user input: the user first has to manually invoke a zoom mechanism to zoom out, then perhaps use scroll bars to locate the general part of interest, again use the zoom mechanism to zoom in on this general part, and finally again use the scroll bars to find exactly the point of interest within the document or image.
- Many users may therefore simply opt to just stay at full scale, and quickly scroll through the document or image vertically and/or horizontally to locate the exact point of interest by trial and error. For example, within a word processing document, the user may know that the point of interest is located somewhere in the middle of the document. Therefore, the user may scroll down very quickly through the document, such that the document is not readable because of the speed at which the user is moving through it, and occasionally slow down or stop scrolling to determine if further scrolling in the same direction is needed, or if the desired point has in fact been overshot. If overshot, then the user must begin scrolling in the opposite direction. All this continues until the desired point of interest is finally reached.
- These and other prior art approaches for navigating a long document or large image thus leave much to be desired. They do not allow easy and precise navigation through a long document or large image to quickly locate a desired point within the document or image. For this and other reasons, there is a need for the present invention.
- The invention relates to speed-dependent automatic zooming through content such as documents and images. In one embodiment, a method first receives an input, such as a user input on an input device like a mouse or a trackball. Other input devices amenable to an embodiment of the invention include self-centering input devices, such as self-centering joysticks, levers, etc. The input is mapped to either speed of navigation through a content space, or scale of the content space while being navigated. The other of speed or scale to which the input was not mapped is then determined, based on the relationship that scale times speed equals a constant. The content space is then navigated, based on the speed or scale mapped from the input, and the scale or speed determined.
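The relationship just described can be sketched in a few lines of Python. This is an illustration only; the constant K, the clamp, and all names are assumptions, not values prescribed by the patent. One of speed or scale is mapped from the input, and the other is derived from the constraint that speed times scale equals a constant:

```python
# Illustrative sketch: derive the unmapped quantity from the mapped one,
# using the relationship speed * scale = K. K and the clamp are assumptions.

K = 200.0          # perceptual-speed constant (assumed units)
MAX_SCALE = 1.0    # full-scale view

def scale_from_speed(speed):
    """Input was mapped to speed; determine scale so speed * scale = K."""
    if speed == 0:
        return MAX_SCALE               # not navigating: show full scale
    return min(MAX_SCALE, K / abs(speed))

def speed_from_scale(scale):
    """Input was mapped to scale; determine speed so speed * scale = K."""
    return K / scale
```

Either function can serve as the "determining" step, depending on which quantity the input device was mapped to.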
- Embodiments of the invention provide advantages not found within the prior art. In one embodiment, a single input, such as movement of a mouse, ultimately controls both the speed of navigation through content and the scale of that content while being navigated. For example, while content is being navigated quickly, the scale of the content is reduced, so the user can still easily get a sense of where he or she is navigating within the content. Then, when the user slows down navigation, the scale of the content is automatically increased, so the user is able to precisely locate an exact desired point within the content. This is in contrast to the prior art, which requires separate user inputs to control the scale of the content while being navigated and the speed of navigation through the content.
- Furthermore, it is noted that in one embodiment, perceptual benefits are provided for the user. For example, within the prior art, if the user scrolls too fast, the perceptual scrolling speed, which is the visual speed of the document across the screen, becomes too fast to read the document, and the user can become disoriented. However, in an embodiment of the invention, the perceptual scrolling speed remains constant, by controlling the zooming level based on the relationship that scale times speed equals a constant. Thus, the user does not become disoriented within the document.
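This invariance is easy to check numerically. In the sketch below, the constant K is an assumed value; whatever actual navigation speed is chosen, the perceptual (on-screen) speed, which is actual speed times scale, comes out equal to K:

```python
# Perceptual speed = actual speed * scale. Holding speed * scale = K
# constant means the on-screen motion stays at a readable rate, no
# matter how fast the user actually navigates. K is assumed.
K = 300.0

def scale_for(speed):
    return K / speed

# sweep across slow, fast, and very fast navigation speeds
perceptual = [speed * scale_for(speed) for speed in (300.0, 600.0, 2400.0)]
# every entry equals K: the visual scrolling rate never changes
```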
- The invention includes computer-implemented methods, machine-readable media, computerized systems, and computers of varying scopes. Other aspects, embodiments and advantages of the invention, beyond those described here, will become apparent by reading the detailed description and with reference to the drawings.
- FIG. 1 is a diagram of an operating environment in conjunction with which embodiments of the invention can be practiced;
- FIGS. 2-4 are diagrams illustrating an example of navigation through content in accordance with an embodiment of the invention;
- FIG. 5 is a diagram of an indicator that can be used with an embodiment of the invention;
- FIG. 6 is a diagram of a graph showing the relationship between scale and speed as affected by input such as a change in position of an input device, such as a pointing device, according to one embodiment of the invention;
- FIG. 7 is a flowchart of a method according to an embodiment of the invention; and,
- FIG. 8 is a diagram of a system according to an embodiment of the invention.
- In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the spirit or scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
- Some portions of the detailed descriptions which follow are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated.
- It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as processing or computing or calculating or determining or displaying or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- Referring to FIG. 1, a diagram of the hardware and operating environment in conjunction with which embodiments of the invention may be practiced is shown. The description of FIG. 1 is intended to provide a brief, general description of suitable computer hardware and a suitable computing environment in conjunction with which the invention may be implemented. Although not required, the invention is described in the general context of computer-executable instructions, such as program modules, being executed by a computer, such as a personal computer. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
- Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PC's, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
- The exemplary hardware and operating environment of FIG. 1 for implementing the invention includes a general-purpose computing device in the form of a computer 20, including a processing unit 21, a system memory 22, and a system bus 23 that operatively couples various system components, including the system memory, to the processing unit 21. There may be only one or there may be more than one processing unit 21, such that the processor of the computer 20 comprises a single central processing unit (CPU), or a plurality of processing units, commonly referred to as a parallel processing environment. The computer 20 may be a conventional computer, a distributed computer, or any other type of computer; the invention is not so limited. - The
system bus 23 may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory may also be referred to as simply the memory, and includes read-only memory (ROM) 24 and random-access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between elements within the computer 20, such as during start-up, is stored in ROM 24. The computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk, not shown, a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD-ROM or other optical media. - The
hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical disk drive interface 34, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computer 20. It should be appreciated by those skilled in the art that any type of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may be used in the exemplary operating environment. - A number of program modules may be stored on the hard disk,
magnetic disk 29, optical disk 31, ROM 24, or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37, and program data 38. A user may enter commands and information into the personal computer 20 through input devices such as a keyboard 40 and pointing device 42. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port, or a universal serial bus (USB). A monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor, computers typically include other peripheral output devices (not shown), such as speakers and printers. - The computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as
remote computer 49. These logical connections are achieved by a communications device coupled to or a part of the computer 20; the invention is not limited to a particular type of communications device. The remote computer 49 may be another computer, a server, a router, a network PC, a client, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 20, although only a memory storage device 50 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local-area network (LAN) 51 and a wide-area network (WAN) 52. Such networking environments are commonplace in office networks, enterprise-wide computer networks, intranets and the Internet, which are all types of networks. - When used in a LAN-networking environment, the computer 20 is connected to the
local network 51 through a network interface or adapter 53, which is one type of communications device. When used in a WAN-networking environment, the computer 20 typically includes a modem 54, a type of communications device, or any other type of communications device for establishing communications over the wide-area network 52, such as the Internet. The modem 54, which may be internal or external, is connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage device. It is appreciated that the network connections shown are exemplary, and that other means of, and communications devices for, establishing a communications link between the computers may be used. - In this section of the detailed description, an example of how an embodiment of the invention operates is presented. The invention itself, however, is not limited to this example, which is presented herein for illustrative purposes only. Specific methods and systems according to varying embodiments of the invention are presented in succeeding sections to describe how an embodiment can achieve the operation described in this section.
- In one embodiment of the invention, a singular user input affects both the speed of navigation through content and the scale of the content while it is being navigated. Navigation as used herein means movement through the content as viewable on a display. For example, at full scale, only a portion of the content may be viewable on the display, whereas at 25% scale, the entire content may be viewable on the display, or at least more of the content than when it is being viewed at full scale, but with less visible detail.
- The speed of navigation refers to the speed at which a visible portion of the content is moved through on the display. In one embodiment, the speed of navigation can be likened to scrolling speed. When content is navigated through very quickly, it is difficult for a user to discern details of the content as they move on the display; conversely, when content is navigated through very slowly, it is easier for a user to discern content details, and much easier still when the content is not moving at all.
- The scale of the content is indicative of how much of the document is viewable on the display while it is being navigated. At full scale, all the details of the content are typically visible on the display; however, only a small part of the document is viewable on the display. Reducing scale increases the portion of the document that is viewable on the display, but with a reduction in the visible detail. For example, at full scale, one paragraph of a text document may be viewable on the display, such that all the words of the paragraph are visible. At a lesser scale, one page of the text document may be viewable, but such that the words of the page are smaller, and thus more difficult to discern. At a least scale, all the pages of the text document may be viewable, but likely such that only chapter headings and other larger indicia of the text document are visible. Scale can in one embodiment be likened to resolution of the content; however, that term is not used universally herein so as not to create confusion with the resolution of the display itself, which typically remains constant for a given display.
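The trade-off between scale and viewable portion can be sketched as follows; the document and viewport sizes here are illustrative assumptions, not values from the specification:

```python
# Illustrative only: how scale trades visible portion against visible
# detail for a fixed-resolution display.
DOC_HEIGHT_PX = 20000     # document height at full scale (assumed)
VIEW_HEIGHT_PX = 1000     # display viewport height (assumed)

def viewable_fraction(scale):
    """Fraction of the document's height visible at a given scale."""
    visible = VIEW_HEIGHT_PX / scale       # document pixels shown on screen
    return min(1.0, visible / DOC_HEIGHT_PX)
```

At full scale only a twentieth of this assumed document is visible; reducing the scale twentyfold brings the whole document onto the display, at the cost of detail.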
- The invention is not limited to a particular type of content that can be navigated in accordance therewith. Typical types of content include word processing documents, including those containing images and other objects in addition to text, text documents, images in various formats, maps (which are a particular type of image), web pages, etc. Other typical types of documents include spreadsheets, drawings and illustrations, such as those utilizing vector graphics, as well as three-dimensional virtual spaces. That is, embodiments of the invention are applicable to any content that has one or more of a one-dimensional, two-dimensional, or three-dimensional spatial nature.
- Some types of documents may have built-in levels of abstraction that are viewable at different scales. For example, a word processing document created using an outline feature of a word processing program may at its highest (full, or 100%) scale have all the words of the document displayed. At a lower scale, only chapter and section headings of the document may be displayed. At a lower scale still, only chapter headings may be displayed. As another example, a map may at full scale have all roads and small towns indicated. At a lower scale, only larger cities and major roads may be indicated. At the lowest scale, only boundaries between states and countries may be shown. Embodiments of the invention are amenable to content that has such built-in levels of abstraction, as well as content that does not. For content that does not, words of a text document may simply appear smaller at lower scales as compared to higher scales, while an image may lose detail as it is displayed at lower scales.
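Such built-in abstraction levels amount to a simple lookup from scale to detail level. The thresholds and labels in this sketch are hypothetical, not taken from the patent:

```python
# Hypothetical mapping from scale to a document's built-in abstraction
# levels, as in the outline and map examples above. Thresholds assumed.
LEVELS = [
    (0.75, "full text"),
    (0.25, "chapter and section headings"),
    (0.00, "chapter headings only"),
]

def detail_level(scale):
    """Return the abstraction level shown at a given scale."""
    for threshold, level in LEVELS:
        if scale > threshold:
            return level
    return LEVELS[-1][1]   # scale of 0 or below: coarsest level
```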
- Reference to the diagrams of FIGS. 2-4 is now made. In FIG. 2, on a display 200, a text document is shown as being navigated within a window 202. For purposes of this example, it is assumed that the document is shown at full scale in FIG. 2. Thus, only a part of the text document is displayed within the window 202; in particular, only the phrase “Chapter 1” and some lines of text are viewable in the window 202 at one time. The user is scrolling slowly through the document. As the user increases the speed of navigation, the scale of the document is automatically reduced, so that more of the document is viewable. This is shown in FIG. 3, where within the window 202 on the display 200 the scale of the text document has been reduced, such that both the phrases “Chapter 1” and “Chapter 2”, and more lines of text, are viewable. However, the size of the text has been decreased. When the user increases the speed of navigation even more, the scale of the document is even further automatically reduced. This is shown in FIG. 4, where within the window 202 on the display 200 the scale of the text document has been reduced so that the three phrases “Chapter 1”, “Chapter 2” and “Chapter 3”, and still more lines of text, are viewable. The size of the text is even smaller, however, so that still more of the document is able to be viewed. That is, the detail of the document has decreased.
- Thus, in accordance with an embodiment of the invention, the scale of content as it is being navigated is related to the speed at which the content is being navigated. When the content is being navigated slowly, the scale is higher, such that less of the document is viewable on the display, but at greater detail. When the content is being navigated quickly, the scale is lower, such that more of the document is viewable on the display, but at less detail. For example, a user may navigate very quickly through a text document to find the chapter in which a desired point of interest (a particular paragraph within that chapter, for example) is located. Therefore, the scale is automatically reduced so that more than one chapter heading (“Chapter 1”, “Chapter 2”, etc.) is viewable on the display at one time—although the text itself of the chapters is likely difficult to discern, as a result of the loss of detail. Once the user has found the chapter in which the desired point of interest is located, he or she can decrease the speed of navigation, causing the scale to automatically be increased. This enables the user to easily discern the text of the desired chapter, and to easily locate the desired point of interest therein.
- In one embodiment, the user controls either the speed or the scale by providing an input, such that the other of the speed or the scale is determined by the relationship speed times scale equals a constant. For example, the user may control either speed or scale via an input device, such as a mouse. When the user presses down on a mouse button, this indicates that the user wishes to activate the automatic zooming interface according to an embodiment of the invention. Thereafter, when the user moves the mouse forward or backward, the scrolling speed of the document changes in accordance with how far the user has moved the mouse. Moving the mouse only a little, for example, corresponds to a slow scrolling speed; moving the mouse a lot corresponds to a fast scrolling speed. Moving the mouse forward as opposed to backward controls the direction of scrolling. Ultimately, the user has affected both speed and scale by his or her singular input. In this example, the user's movement of the mouse a particular distance directly controls the speed of navigation, from which the scale of the document while being navigated is dependently determined. It is noted, however, as can be appreciated by those of ordinary skill within the art, that the specific use of a mouse pointing device as described herein is only an example of operation of an embodiment of the invention, and the invention itself is not limited to this example.
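The mouse example above can be sketched as follows. The gain, dead zone, and constant K are illustrative assumptions; the sign of the displacement carries the scrolling direction:

```python
# Sketch of the mouse example: displacement since button-press sets
# signed scrolling speed; scale then follows speed * scale = K.
GAIN, DEAD_ZONE, K, MAX_SCALE = 4.0, 2.0, 400.0, 1.0

def speed_and_scale(dy):
    """dy: mouse movement (pixels) since the interface was activated.
    Returns (signed speed, scale)."""
    if abs(dy) <= DEAD_ZONE:
        return 0.0, MAX_SCALE              # barely moved: no scrolling
    speed = (abs(dy) - DEAD_ZONE) * GAIN * (1 if dy > 0 else -1)
    scale = min(MAX_SCALE, K / abs(speed))
    return speed, scale
```

A small forward movement yields slow scrolling at full scale; a large movement yields fast scrolling at a reduced scale, and a backward movement reverses the direction.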
- In one embodiment, an indicator appears on the display when the user has activated the automatic zooming interface. Such an indicator is shown in the diagram of FIG. 5. The height of the indicator 500 corresponds to the length of the content being navigated. The bar 502 within the indicator 500 indicates the relative position within the document that is currently being shown on the display. Using an indicator is not required, but is useful in that it provides a visual indication that the automatic zooming interface has been activated, as well as allowing the user to see where the viewable part of the document is in relation to the document as a whole.
- Furthermore, it is noted that while the example presented herein has been made with reference to one-dimensional content—text which can be navigated from beginning to end and vice versa—the invention itself is not so limited. For example, in the context of two-dimensional content such as an image, an automatic zooming interface can be activated in accordance with an embodiment of the invention to provide for the inter-relation of scale and speed via a singular user input simultaneously in both the x and y directions. Because an input device such as a mouse or a joystick provides for movement across two dimensions, such an input device is well suited for use with embodiments of the invention providing for navigation of two-dimensional content such as images.
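The geometry of such an indicator can be sketched as below; the pixel sizes are illustrative, and the bar is simply the viewport's position and extent scaled into the indicator's height:

```python
# Sketch of the indicator of FIG. 5: a bar whose offset and height within
# the indicator mirror the viewport's place in the whole document.
def indicator_bar(doc_length, view_top, view_length, indicator_px=200):
    """Return (bar_top_px, bar_height_px) inside the indicator."""
    bar_height = max(1, round(indicator_px * view_length / doc_length))
    bar_top = round(indicator_px * view_top / doc_length)
    return bar_top, bar_height
```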
- It is also noted that embodiments of the invention can be applied to a variety of computers and computerized devices. A description of a computer has been provided in the previous section of the detailed description. Types of computers include laptop computers, desktop computers, and handheld computers, also known as personal digital assistants (PDA's). Electronic book devices and other computerized devices are amenable to the invention. A navigation system within a car in which a map is shown on a display is amenable to the invention. In such an embodiment, for example, the input provided can be the speed at which the driver is driving, which is typically under control of the driver by virtue of the force exerted on the gas pedal. Still other computers and computerized devices in addition to those referred to herein are also amenable to embodiments of the invention.
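A hedged sketch of this in-car variant follows, with the vehicle's speed standing in for the user input; the constant K and the clamp bounds are assumptions, not values from the patent:

```python
# In-car variant: driving speed plays the role of the input, and map
# scale follows speed * scale = K, clamped to a usable range.
K = 50.0  # assumed: speed (km/h) at which the map is still at full scale

def map_scale(vehicle_speed_kmh, min_scale=0.1, max_scale=1.0):
    """Map scale as a function of vehicle speed."""
    if vehicle_speed_kmh <= 0:
        return max_scale               # stopped: most detailed map
    return max(min_scale, min(max_scale, K / vehicle_speed_kmh))
```

Driving faster zooms the map out so upcoming roads remain in view; slowing down zooms it back in for local detail.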
- In this section of the detailed description, a specific manner by which speed and scale are inter-related, and affected by an input, such as a user input provided on an input device, such as a pointing device, is described, according to one embodiment of the invention. The invention is not limited, however, to the specific manner described in this section. As has been described in the previous section, the input directly controls one of speed and scale, such that the other is determined based on the relationship that speed times scale equals a constant. This ensures that the perceptual navigation speed remains constant, regardless of the actual navigation speed within the content.
- That the input directly controls one of the speed and the scale can be restated as follows: one of the speed and the scale is a function of the input. In one embodiment, the input is the change in position of an input device, such as a pointing device, upon activation of the speed-dependent automatic zooming interface, which can be stated as dy=[y coordinate of the current input device position]−[y coordinate of the position when the interface was activated]. Speed is the navigation speed through the content, and scale is the zoom level, where 1 corresponds to the full-scale view, such that the smaller the scale becomes, the smaller the content appears on the display (i.e., more of the content is viewable on the display). In one embodiment, either speed or scale is linearly related to the change in position of the input device within a specific range of change in position, whereas in another embodiment, either speed or scale is exponentially related to the change in position of the input device within a specific range.
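The linear and exponential variants can be sketched as follows, with assumed range and scale constants (the patent does not prescribe these values):

```python
# Two illustrative ways to map device displacement dy to scale inside
# the active range [D0, D1]: linear and exponential (geometric).
D0, D1 = 10.0, 110.0     # active range of dy (assumed)
S0, S1 = 1.0, 0.1        # scale at the start and end of the range (assumed)

def _t(dy):
    """Normalize dy into [0, 1], clamped outside the range."""
    return min(max((dy - D0) / (D1 - D0), 0.0), 1.0)

def scale_linear(dy):
    return S0 + _t(dy) * (S1 - S0)

def scale_exponential(dy):
    return S0 * (S1 / S0) ** _t(dy)   # geometric interpolation
```

Both start at S0 at the beginning of the range and end at S1; the exponential version spends more of the range near full scale, which can feel smoother for zooming.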
- Referring now to FIG. 6, a diagram of a graph 600 showing the relationship among speed, scale, and the change in position of the input device, according to one specific embodiment, is shown. The solid line 602 corresponds to scale, while the dotted line 604 corresponds to speed. The range 606 indicates the range in which a change of position affects speed and scale, that is, between d0 and d1. In the embodiment of FIG. 6, scale is specifically directly controlled by the change in position of the input device when in the range 606, according to the relationship
- where s0 is the starting scale at the beginning of the range 606 (i.e., when dy=d0), and v0 is the starting speed at the beginning of the range 606. - Therefore, as can be seen from the
graph 600 of FIG. 6, once the speed-dependent automatic zooming interface is activated, and the user has caused a change in the position of the input device greater than d0, the scale of the content being navigated, identified by the solid line 602, decreases, while the speed of navigation, identified by the dotted line 604, increases. As the user increases movement of the input device away from its initial position, such that its change in position continues to increase, the scale continues to decrease, and the speed continues to increase, until the change in position reaches d1, after which scale and speed remain constant. - It is noted that one drawback of the interface as described in conjunction with the graph of FIG. 6 is that the zoom level (i.e., scale level) substantially changes when the user changes the navigation direction in order to go back to a position that has been passed. That is, in the process of moving the input device to the opposite side of the position where the speed-dependent automatic zooming interface was activated, dy gets closer to 0, causing a sudden zooming effect.
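The curves of FIG. 6 can be sketched with assumed constants: inside the range, scale falls while speed rises so that scale times speed stays at the constant K, and both flatten once the displacement exceeds d1:

```python
# Sketch of the FIG. 6 behavior. All constants are assumptions.
D0, D1 = 10.0, 110.0     # active displacement range (assumed)
S0, S_MIN = 1.0, 0.1     # scale limits (assumed)
V0 = 40.0                # speed at the start of the range (assumed)
K = S0 * V0              # scale * speed is held at this constant

def scale_and_speed(dy):
    t = min(max((dy - D0) / (D1 - D0), 0.0), 1.0)  # clamped: flat past D1
    scale = S0 + t * (S_MIN - S0)
    return scale, K / scale
```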
- To prevent this situation, in one embodiment a delayed zooming effect is utilized. Zoom level changes at most at a specific maximum rate. If the user moves the input device quickly, thereby requesting a sudden change of zoom (that is, scale) level, the zoom level changes with delay to achieve a smoother transition. In one embodiment, this delay effect is applied only when the target scale as determined by dy is smaller than the current scale. This delay mechanism can also be referred to as a “controlled return” to the target scale.
-
- where sign is +1 if dy is non-negative (i.e., positive or zero), and is −1 if dy is negative. In this embodiment of the invention, scale times speed equals a function of dy, that is, a function of movement of the input device. The invention is not limited to the function listed above, however. That is, more generally,
- scale · speed = ƒ(dy)
- in some embodiments of the invention. It is noted that this reduces to the specific case already described, where scale times speed equals a constant, where the function of dy is a constant. That is, ƒ(dy)=k, where k is the constant.
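The delayed zooming ("controlled return") described above can be sketched as a per-frame rate limit on the displayed scale; the maximum rate here is an assumption, and, per the text, the delay applies only when the target scale is smaller than the current scale:

```python
# Delayed zooming sketch: the displayed zoom level chases the target
# at a bounded rate per frame when zooming out; zooming in is immediate.
MAX_RATE = 0.05  # maximum decrease in scale per frame (assumed)

def step_zoom(current, target):
    if target >= current:
        return target                        # zooming in: no delay
    return max(target, current - MAX_RATE)   # zooming out: rate-limited
```

Called once per frame, this converges smoothly to the target scale instead of snapping, which avoids the sudden zooming effect when dy crosses back toward 0.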
- In this section of the detailed description, methods according to varying embodiments of the invention are described. The methods can be utilized to achieve a speed-dependent automatic zooming interface as has been described in a preceding section of the detailed description, in accordance with, for example, the graph described in the immediately previous section, in one embodiment. In some embodiments, the methods are computer-implemented. The computer-implemented methods can be realized at least in part as one or more programs running on a computer—that is, as a program executed from a computer-readable medium such as a memory by a processor of a computer, such as the computer shown in and described in conjunction with FIG. 1. The programs are desirably storable on a machine-readable medium such as a floppy disk or a CD-ROM, for distribution and installation and execution on another computer.
- Referring to FIG. 7, a flowchart of a method according to an embodiment of the invention is shown. In 700, an input is received. For example, the input can be a user input, such as a user input asserted via an input device, such as a pointing device. The input device can be any type of input device, such as a self-centering joystick, a mouse, a mouse wheel, a joystick, a trackball, a touchpad, or a pointstick, although the invention itself is not limited to the list of input devices recited herein.
- In 702, either the speed of navigation through a content space—that is, through content—or the scale of the content space while being navigated is mapped from the input. This can be in accordance with the graph of FIG. 6, as described in the previous section of the detailed description. In one embodiment, the input is mapped to scale, either linearly or exponentially; in another embodiment, the input is mapped to speed, also either linearly or exponentially. The invention is not limited to a particular type of content space; that is, it is not limited to a particular type of content. Types of content include word processing documents, maps, text documents, images, and web pages, although the invention itself is not limited to the list of content spaces recited herein. In one embodiment, the input is mapped to either scale or speed such that the scale or speed has a maximum rate of change in reflecting the input, as described in the previous section of the detailed description.
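The linear and exponential mapping alternatives mentioned above can be sketched as follows; the gain and base values, and the function names, are arbitrary illustrative choices rather than anything prescribed by the specification.

```python
def map_linear(dy: float, gain: float = 0.05) -> float:
    """Linear mapping: input movement dy shifts the mapped quantity by a fixed gain."""
    return 1.0 + gain * dy

def map_exponential(dy: float, base: float = 1.05) -> float:
    """Exponential mapping: each unit of movement multiplies the mapped
    quantity by a fixed base, so equal movements give equal ratios."""
    return base ** dy
```

With the exponential mapping, moving the device twice as far squares the factor applied to the scale (or speed), which is why exponential mappings feel uniform across a wide range of zoom levels.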
- In 704, the other of speed and scale, to which the input was not mapped, is determined based on the relationship that speed times scale equals a constant. For example, where speed was mapped directly from the input in 702, then in 704 the scale is determined based on this relationship. As another example, where scale was mapped directly from the input in 702, then in 704 the speed is determined based on this relationship. Finally, in 706, the content space is navigated, based on the speed and scale as mapped and determined in 702 and 704.
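A minimal sketch of one pass through the method of FIG. 7 might look like the following, assuming a linear input-to-speed mapping, a constant K, and a clamp enforcing the maximum rate of change; all three values are illustrative assumptions, not taken from the specification.

```python
K = 1.0           # assumed constant: speed * scale == K while navigating
MAX_DELTA = 0.2   # assumed cap on how fast the mapped quantity may change per step

def navigate_step(dy: float, prev_speed: float, position: float):
    """One pass through FIG. 7: receive input (700), map it to speed (702),
    derive scale from speed * scale == K (704), and navigate (706)."""
    target_speed = 0.05 * dy                       # 702: linear mapping (assumed gain)
    delta = max(-MAX_DELTA, min(MAX_DELTA, target_speed - prev_speed))
    speed = prev_speed + delta                     # limited rate of change
    scale = K / abs(speed) if speed else 1.0       # 704: determine the other quantity
    position += speed                              # 706: navigate the content space
    return position, speed, scale
```

Called once per input event, this yields the characteristic behavior: the faster the user scrolls, the smaller the scale (the further the view zooms out), and vice versa.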
- It is noted that other input devices that are amenable to embodiments of the invention include self-centering input devices, such as self-centering joysticks, and levers. For example, in such devices, the position, or angle, of the device is mapped to either speed or scaling, as has been described. When the user releases the device, or otherwise returns the device to its center or original position, the scrolling stops, and the zoom level returns to its original level. Furthermore, speed can be made proportional to the force exerted on a self-centering isometric joystick, for example.
- Such self-centering input devices provide certain advantages. For example, the user can feel the current speed and/or scale, by touch, in that the device gives intuitive feedback, which may not be possible with a device such as a mouse. Furthermore, the user can stop scrolling and return to the original scale simply by releasing the device, such as the stick of the joystick. The physical setup of such devices is thus consistent with the behavior of embodiments of the invention.
- Other input devices that are amenable to embodiments of the invention include device buttons, keyboard keys, etc. For example, in such an embodiment, pressing and holding down a first button can cause the speed to gradually increase, and the view to zoom out, while releasing the button causes scrolling to stop, and the zoom level to return to its original level. Such an embodiment may be particularly useful for a personal-digital-assistant (PDA) or other hand-held device, for example.
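A hedged sketch of this press-and-hold embodiment, with an assumed per-tick ramp rate and constant: holding the button increments the speed each tick (with the view zooming out correspondingly), and releasing it stops scrolling and restores the original zoom level.

```python
K = 1.0            # assumed constant: speed * scale == K while scrolling
RAMP = 0.1         # assumed per-tick speed increase while the button is held
BASE_SCALE = 1.0   # original zoom level

def button_step(held: bool, speed: float):
    """One timer tick of the press-and-hold embodiment; returns (speed, scale)."""
    if held:
        speed += RAMP            # speed gradually increases while held...
        return speed, K / speed  # ...and the view zooms out accordingly
    return 0.0, BASE_SCALE       # release: scrolling stops, zoom is restored
```

Because the state collapses to `(0.0, BASE_SCALE)` the moment the button is released, a single button suffices to control both speed and scale, which is what makes this variant attractive on a PDA or other hand-held device.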
- In this section of the detailed description, systems according to varying embodiments of the invention are described. Referring to FIG. 8, a diagram of a system according to a specific embodiment is shown. The system 800 includes an input device 802, a display 804, and a computer program 806. The input device 802, such as a pointing device, has at least an output based on a singular user input. For example, the output may be based on the user moving the input device 802 over a planar surface, which corresponds to a singular user input, and which is the case when the input device 802 is a mouse. The output may also be based on a user rotating the input device 802 on a fixed axis, which also corresponds to a singular user input, and which is the case when the input device 802 is a wheel of a mouse. There may be more than one output of the input device 802, such as where a mouse can be moved over a planar surface (first output), has a rotatable wheel (second output), and has two mouse buttons (third and fourth outputs). As described in accordance with FIG. 8, the system is concerned only with one output of the input device 802, based on a singular user input. The display 804 can be a flat-panel display, a cathode-ray tube, etc.; the invention is not so limited. The display 804 is such that navigation of a content space is shown thereon.
- The computer program 806 is designed to receive the singular user input as output from the input device 802, and to show the navigation of the content space on the display 804 such that it has speed of navigation and scale while being navigated based only on a user input comprising the singular user input. That is, both the speed and the scale are based on the singular user input. For example, in one specific embodiment, the user input can be mapped to the scale of the content space while being navigated, and the speed of navigation through the content space can be determined based on the relationship that scale times speed equals a constant. In the sense that the scale is mapped from the singular user input, and the speed is determined from the scale, both the scale and the speed are based on the singular user input. That is, it is not the case that one input controls only speed while another input controls only scale; rather, a single input affects both speed and scale.
- In one particular embodiment, the user input is mapped to the scale exponentially or linearly, while in another particular embodiment, the user input is mapped to the scale such that it has a maximum rate of change in reflecting the user input. In another embodiment, the speed of navigation is mapped from the user input, and the scale is determined from the speed based on the relationship that speed times scale equals a constant. The content space being navigated is not limited by the invention; it can include, for example, a word processing document, a map, a text document, an image, a web page, etc. The computer program 806 in one embodiment is executed from a computer-readable medium, such as a memory or a hard disk drive, by a processor. In one particular embodiment, the program 806 corresponds to a means for receiving the singular user input, and for showing the navigation of the content space as having speed of navigation and scale while being navigated based only on the singular user input.
- Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is manifestly intended that this invention be limited only by the claims and equivalents thereof.
Claims (26)
1. A navigation system comprising:
a display for displaying an area of a map;
a component that receives speed information relating to movement of a vehicle; and
a navigation component that modifies a scale of the map display area as a function of the speed information.
2. The system of claim 1, the display is a graphical user interface within the vehicle.
3. The system of claim 1, the speed information is based at least in part on force exerted on an accelerator.
4. The system of claim 1, the speed information is based at least in part on speedometer information.
5. The system of claim 1, the speed information is based at least in part on odometer information comprising distance traveled over a period of time.
6. The system of claim 1, the scale of the map display area is inversely proportional to the speed of the vehicle.
7. The system of claim 6, the product of the speed of the vehicle and the scale of the map display area is equal to a constant.
8. The system of claim 1, the navigation component modifies the scale of the map display area as an exponential function of the speed information.
9. The system of claim 1, the navigation component modifies the scale of the map display area as a linear function of the speed information.
10. The system of claim 1, the rate at which the scale of the map display area is modified is a function of a rate of change of the speed information.
11. The system of claim 1, the map is at least one of a road map, a topographical map, and an aerial map.
12. A method for automatically zooming a map area display comprising:
displaying a map area to a user in a vehicle;
selectively indicating position of the vehicle on the map area display;
determining speed information related to movement of the vehicle; and
modifying scale of the map area display as a function of the speed information of the vehicle.
13. The method of claim 12, further comprising modifying the scale of the map area display as a function of intervals of speeds of the vehicle.
14. The method of claim 13, the scale of the map area display is modified when the speed of the vehicle crosses an interval boundary.
15. The method of claim 12, further comprising modifying the scale of the map area display at a rate that is dependent on the rate of change of the speed information.
16. The method of claim 15, the rate at which the scale of the map area display is modified has a maximum limit.
17. The method of claim 12, further comprising determining a base scale at which to display the map area.
18. The method of claim 17, further comprising increasing or decreasing the scale of the map display area from the base scale as a function of the speed information of the vehicle.
19. The method of claim 12, further comprising positioning the vehicle at the center of the map display area while displaying the map area to the user in the vehicle.
20. The method of claim 12, the scale of the map area display is equal to a constant divided by the speed of the vehicle.
21. The method of claim 20, the scale of the map area display and the speed of the vehicle are linearly related.
22. The method of claim 20, the scale of the map area display and the speed of the vehicle are exponentially related.
23. The method of claim 12, further comprising selectively modifying the scale of the map area display as a function of the complexity of the map, the scale of the map area display being directly proportional to the complexity of the map.
24. A method for automatically zooming map area display scale, comprising:
means for displaying a map to a user in a vehicle;
means for determining speed information related to the vehicle; and
means for adjusting scale of the map based at least in part on speed information related to the vehicle.
25. The method of claim 24, further comprising means for selectively indicating the position of the vehicle on the map area display.
26. The method of claim 24, the product of the scale of the map area display and the speed of the vehicle equals a constant.
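As an illustration only, and not a construction of the claims, the interval-based zooming of claims 13 and 14 could be sketched as follows: the map scale changes only when the vehicle's speed crosses an interval boundary. The speed bands and scale values here are assumed examples.

```python
# Assumed example bands: (lower speed bound in km/h, map scale to use).
# The scale shrinks (view zooms out) as speed enters higher intervals,
# consistent with scale being inversely related to speed.
SPEED_BANDS = [(0, 1.0), (30, 0.5), (60, 0.25)]

def map_scale_for_speed(speed_kmh: float) -> float:
    """Return the map scale for the interval containing the current speed."""
    scale = SPEED_BANDS[0][1]
    for lower, band_scale in SPEED_BANDS:
        if speed_kmh >= lower:
            scale = band_scale  # highest band whose lower bound is reached
    return scale
```

Within a band the scale is constant, so small speed fluctuations do not cause the map to rescale; only crossing a boundary (e.g. passing 30 km/h in this sketch) triggers a zoom change.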
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/774,797 US20040160458A1 (en) | 1999-12-13 | 2004-02-09 | Speed dependent automatic zooming interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/460,028 US6747680B1 (en) | 1999-12-13 | 1999-12-13 | Speed-dependent automatic zooming interface |
US10/774,797 US20040160458A1 (en) | 1999-12-13 | 2004-02-09 | Speed dependent automatic zooming interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/460,028 Division US6747680B1 (en) | 1997-06-18 | 1999-12-13 | Speed-dependent automatic zooming interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040160458A1 true US20040160458A1 (en) | 2004-08-19 |
Family
ID=32326719
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/460,028 Expired - Lifetime US6747680B1 (en) | 1997-06-18 | 1999-12-13 | Speed-dependent automatic zooming interface |
US10/774,797 Abandoned US20040160458A1 (en) | 1999-12-13 | 2004-02-09 | Speed dependent automatic zooming interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/460,028 Expired - Lifetime US6747680B1 (en) | 1997-06-18 | 1999-12-13 | Speed-dependent automatic zooming interface |
Country Status (1)
Country | Link |
---|---|
US (2) | US6747680B1 (en) |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010032221A1 (en) * | 2000-04-14 | 2001-10-18 | Majid Anwar | Systems and methods for generating visual representations of graphical data and digital document processing |
US20040219980A1 (en) * | 2003-04-30 | 2004-11-04 | Nintendo Co., Ltd. | Method and apparatus for dynamically controlling camera parameters based on game play events |
US20060271870A1 (en) * | 2005-05-31 | 2006-11-30 | Picsel Research Limited | Systems and methods for navigating displayed content |
US20060281471A1 (en) * | 2005-06-08 | 2006-12-14 | Cisco Technology,Inc. | Method and system for communicating using position information |
US20070036118A1 (en) * | 2005-08-10 | 2007-02-15 | Cisco Technology, Inc. | Method and system for automatic configuration of virtual talk groups based on location of media sources |
US20070036100A1 (en) * | 2005-08-10 | 2007-02-15 | Cisco Technology, Inc. | Method and system for communicating media based on location of media source |
US20070047479A1 (en) * | 2005-08-29 | 2007-03-01 | Cisco Technology, Inc. | Method and system for conveying media source location information |
US20070052732A1 (en) * | 2005-08-01 | 2007-03-08 | Microsoft Corporation | Resolution independent image resource |
US20070202908A1 (en) * | 2006-02-28 | 2007-08-30 | Cisco Technology, Inc. | Method and system for providing interoperable communications with dynamic event area allocation |
US20080155475A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Scrolling interface |
US20080155474A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Scrolling interface |
US20080150892A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Collection browser for image items with multi-valued attributes |
WO2010026044A1 (en) * | 2008-09-03 | 2010-03-11 | Volkswagen Ag | Method and device for displaying information, in particular in a vehicle |
US20100123734A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image processing method, and image display program |
US20100125786A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image display method, and image display program |
US20100328351A1 (en) * | 2009-06-29 | 2010-12-30 | Razer (Asia-Pacific) Pte Ltd | User interface |
US20110102455A1 (en) * | 2009-11-05 | 2011-05-05 | Will John Temple | Scrolling and zooming of a portable device display with device motion |
US20110119578A1 (en) * | 2009-11-17 | 2011-05-19 | Schwartz Michael U | Method of scrolling items on a touch screen user interface |
US20110134126A1 (en) * | 2009-05-12 | 2011-06-09 | Reiko Miyazaki | Information processing device, information processing method, and information processing program |
US20120110501A1 (en) * | 2010-11-03 | 2012-05-03 | Samsung Electronics Co. Ltd. | Mobile terminal and screen change control method based on input signals for the same |
KR20120047195A (en) * | 2010-11-03 | 2012-05-11 | 삼성전자주식회사 | Controlling method for changing screen based on a input signal and portable device supporting the same |
US20120127107A1 (en) * | 2009-07-28 | 2012-05-24 | Ken Miyashita | Display control device, display control method, and computer program |
US20130019200A1 (en) * | 2005-01-31 | 2013-01-17 | Roland Wescott Montague | Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag |
US8397180B2 (en) | 2006-12-21 | 2013-03-12 | Canon Kabushiki Kaisha | Scrolling browser with previewing area |
JP2013097426A (en) * | 2011-10-28 | 2013-05-20 | Nintendo Co Ltd | Information processing program, information processing device, information processing system, and information processing method |
JP2014194773A (en) * | 2013-03-28 | 2014-10-09 | Samsung Electronics Co Ltd | Display method for display apparatus, and display apparatus |
US9360993B2 (en) | 2002-03-19 | 2016-06-07 | Facebook, Inc. | Display navigation |
US9619132B2 (en) | 2007-01-07 | 2017-04-11 | Apple Inc. | Device, method and graphical user interface for zooming in on a touch-screen display |
US9778836B2 (en) | 2000-04-14 | 2017-10-03 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
DE102016212139A1 (en) * | 2016-07-04 | 2018-01-04 | Bayerische Motoren Werke Aktiengesellschaft | Method for displaying data elements, operating device for a vehicle, and vehicle comprising the operating device |
US11244427B2 (en) * | 2018-04-27 | 2022-02-08 | Tencent Technology (Shenzhen) Company Ltd | Image resolution processing method, system, and apparatus, storage medium, and device |
Families Citing this family (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6738045B2 (en) * | 2001-02-26 | 2004-05-18 | Microsoft Corporation | Method and system for accelerated data navigation |
US6989815B2 (en) * | 2001-09-13 | 2006-01-24 | E-Book Systems Pte Ltd. | Method for flipping pages via electromechanical information browsing device |
KR100608735B1 (en) * | 2002-07-09 | 2006-08-04 | 엘지전자 주식회사 | Picture display method for mobile communication device |
US8015259B2 (en) * | 2002-09-10 | 2011-09-06 | Alan Earl Swahn | Multi-window internet search with webpage preload |
JP4111834B2 (en) * | 2003-01-07 | 2008-07-02 | 株式会社ソニー・コンピュータエンタテインメント | Image generation method and apparatus |
EP1586084A2 (en) * | 2003-01-21 | 2005-10-19 | E-Book Systems Pte. Ltd. | A programmable virtual book system |
US20040243307A1 (en) | 2003-06-02 | 2004-12-02 | Pieter Geelen | Personal GPS navigation device |
JP2005098732A (en) * | 2003-09-22 | 2005-04-14 | Alpine Electronics Inc | Navigation system and map-display method |
US7327349B2 (en) * | 2004-03-02 | 2008-02-05 | Microsoft Corporation | Advanced navigation techniques for portable devices |
US8418075B2 (en) | 2004-11-16 | 2013-04-09 | Open Text Inc. | Spatially driven content presentation in a cellular environment |
US8001476B2 (en) | 2004-11-16 | 2011-08-16 | Open Text Inc. | Cellular user interface |
JP4356594B2 (en) * | 2004-11-22 | 2009-11-04 | ソニー株式会社 | Display device, display method, display program, and recording medium on which display program is recorded |
JP4839603B2 (en) * | 2004-11-22 | 2011-12-21 | ソニー株式会社 | Display device, display method, display program, and recording medium on which display program is recorded |
JP4653561B2 (en) * | 2005-05-31 | 2011-03-16 | 株式会社東芝 | Information processing apparatus and display control method |
CN100356370C (en) * | 2005-12-15 | 2007-12-19 | 无锡永中科技有限公司 | Processing method of enhancing opening speed of word processing file |
US7761804B2 (en) * | 2006-02-01 | 2010-07-20 | Ricoh Company, Ltd. | Avoiding disorientation under discontinuous navigation in an image flipping system |
JP5129478B2 (en) * | 2006-03-24 | 2013-01-30 | 株式会社デンソーアイティーラボラトリ | Screen display device |
CN101042300B (en) * | 2006-03-24 | 2014-06-25 | 株式会社电装 | Image display apparatus |
WO2007129247A1 (en) * | 2006-05-08 | 2007-11-15 | Koninklijke Philips Electronics N.V. | Method and device for displaying visual representations of a plurality of items |
KR101406289B1 (en) * | 2007-03-08 | 2014-06-12 | 삼성전자주식회사 | Apparatus and method for providing items based on scrolling |
US7768536B2 (en) * | 2007-04-11 | 2010-08-03 | Sony Ericsson Mobile Communications Ab | Methods of displaying information at different zoom settings and related devices and computer program products |
US7810044B2 (en) * | 2007-04-30 | 2010-10-05 | Hewlett-Packard Development Company, L.P. | Electronic device display adjustment interface |
US20090015568A1 (en) * | 2007-07-12 | 2009-01-15 | Koski David A | Method and Apparatus for Implementing Slider Detents |
US8462112B2 (en) * | 2007-07-12 | 2013-06-11 | Apple Inc. | Responsiveness control system for pointing device movement with respect to a graphical user interface |
US20090015557A1 (en) * | 2007-07-12 | 2009-01-15 | Koski David A | Responsiveness Control Method for Pointing Device Movement With Respect to a Graphical User Interface |
US20090037840A1 (en) * | 2007-08-03 | 2009-02-05 | Siemens Medical Solutions Usa, Inc. | Location Determination For Z-Direction Increments While Viewing Medical Images |
US10134044B1 (en) | 2008-05-28 | 2018-11-20 | Excalibur Ip, Llc | Collection and use of fine-grained user behavior data |
DE102009019563A1 (en) | 2009-04-30 | 2010-11-04 | Volkswagen Ag | Method and device for displaying list-ordered information |
CH701440A2 (en) * | 2009-07-03 | 2011-01-14 | Comme Le Temps Sa | Wrist touch screen and method for displaying on a watch with touch screen. |
US9035887B1 (en) | 2009-07-10 | 2015-05-19 | Lexcycle, Inc | Interactive user interface |
US8347232B1 (en) | 2009-07-10 | 2013-01-01 | Lexcycle, Inc | Interactive user interface |
US8817052B2 (en) * | 2009-11-02 | 2014-08-26 | Sony Corporation | Information processing apparatus, image enlargement processing method, and computer program product with visible data area enlargement features |
US20110214088A1 (en) * | 2010-02-26 | 2011-09-01 | Research In Motion Limited | Automatic scrolling of electronic messages |
US8683377B2 (en) * | 2010-05-12 | 2014-03-25 | Adobe Systems Incorporated | Method for dynamically modifying zoom level to facilitate navigation on a graphical user interface |
JP2012093860A (en) | 2010-10-25 | 2012-05-17 | Aisin Aw Co Ltd | Display device, display method and display program |
JP2012093887A (en) * | 2010-10-26 | 2012-05-17 | Aisin Aw Co Ltd | Display device, display method and display program |
US8826191B1 (en) * | 2011-01-05 | 2014-09-02 | Google Inc. | Zooming while page turning in document |
JP5667469B2 (en) * | 2011-02-24 | 2015-02-12 | 京セラ株式会社 | Electronic device, display control method, and display control program |
US9347791B2 (en) * | 2011-10-07 | 2016-05-24 | The Boeing Company | Methods and systems for operating a touch screen display |
JP5994412B2 (en) * | 2012-06-13 | 2016-09-21 | 富士ゼロックス株式会社 | Image display apparatus, image control apparatus, image forming apparatus, and program |
WO2014058144A1 (en) | 2012-10-10 | 2014-04-17 | 에스케이플래닛 주식회사 | Method and system for displaying fast-scrolling content and scroll bar |
TWI467467B (en) * | 2012-10-29 | 2015-01-01 | Pixart Imaging Inc | Method and apparatus for controlling object movement on screen |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US10423214B2 (en) | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US10474342B2 (en) * | 2012-12-17 | 2019-11-12 | Microsoft Technology Licensing, Llc | Scrollable user interface control |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
JP6125467B2 (en) * | 2014-06-16 | 2017-05-10 | 富士フイルム株式会社 | Print order receiving machine, its operating method and operating program |
KR102240640B1 (en) * | 2014-07-03 | 2021-04-15 | 엘지전자 주식회사 | Display apparatus and method of controlling the same |
US10042532B2 (en) * | 2015-05-05 | 2018-08-07 | Facebook, Inc. | Methods and systems for viewing embedded content |
US10685471B2 (en) | 2015-05-11 | 2020-06-16 | Facebook, Inc. | Methods and systems for playing video while transitioning from a content-item preview to the content item |
DE102019202592A1 (en) * | 2019-02-26 | 2020-08-27 | Volkswagen Aktiengesellschaft | Method for operating a driver information system in an ego vehicle and driver information system |
Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3618240A (en) * | 1967-09-05 | 1971-11-09 | Charles Pelin | Route map and feature display device for moving vehicles |
US5189430A (en) * | 1989-10-24 | 1993-02-23 | Mitsubishi Denki Kabushiki Kaisha | Navigation system for movable body |
US5732385A (en) * | 1994-04-15 | 1998-03-24 | Nissan Motor Co., Ltd. | Vehicle navigation system displaying bird-eye view of different visual points and different contraction scale ratios depending upon vehicle travel conditions |
US5864305A (en) * | 1994-03-04 | 1999-01-26 | Ab Volvo | Traffic information system |
US5874943A (en) * | 1993-03-24 | 1999-02-23 | International Business Machines Corporation | Feedback of object size during direct manipulation |
US5884218A (en) * | 1995-09-29 | 1999-03-16 | Aisin Aw Co., Ltd. | Map indication device and navigation device |
US5897604A (en) * | 1995-12-26 | 1999-04-27 | Nissan Motor Co., Ltd. | Apparatus and method for navigating mobile body using bird's eye view on display screen |
US5948040A (en) * | 1994-06-24 | 1999-09-07 | Delorme Publishing Co. | Travel reservation information and planning system |
US6014142A (en) * | 1995-11-13 | 2000-01-11 | Platinum Technology Ip, Inc. | Apparatus and method for three dimensional manipulation of point of view and object |
US6032098A (en) * | 1995-04-17 | 2000-02-29 | Honda Giken Kogyo Kabushiki Kaisha | Automatic travel guiding device for vehicle |
US6064941A (en) * | 1996-09-30 | 2000-05-16 | Aisin Aw Co., Ltd. | Vehicle navigation apparatus and storage medium |
US6067502A (en) * | 1996-08-21 | 2000-05-23 | Aisin Aw Co., Ltd. | Device for displaying map |
US6125323A (en) * | 1996-04-28 | 2000-09-26 | Aisin Aw Co., Ltd. | Device for processing road data or intersection data |
US6154205A (en) * | 1998-03-25 | 2000-11-28 | Microsoft Corporation | Navigating web-based content in a television-based system |
US6157342A (en) * | 1997-05-27 | 2000-12-05 | Xanavi Informatics Corporation | Navigation device |
US6163752A (en) * | 1998-03-05 | 2000-12-19 | Volkswagen Ag | Method and arrangement for representing data in vehicle navigation systems |
US20010002817A1 (en) * | 1999-12-07 | 2001-06-07 | Raymond Berlioz | Indicator of a variable for aircraft |
US6279906B1 (en) * | 1997-06-18 | 2001-08-28 | Act Labs, Ltd. | Video game controller system with interchangeable interface adapters |
US6323878B1 (en) * | 1999-03-03 | 2001-11-27 | Sony Corporation | System and method for providing zooming video capture |
US6326970B1 (en) * | 1997-05-16 | 2001-12-04 | Liberate Technologies | TV centric layout |
US6333752B1 (en) * | 1998-03-13 | 2001-12-25 | Ricoh Company, Ltd. | Image processing apparatus, image processing method, and a computer-readable storage medium containing a computer program for image processing recorded thereon |
US6339434B1 (en) * | 1997-11-24 | 2002-01-15 | Pixelworks | Image scaling circuit for fixed pixed resolution display |
US6632138B1 (en) * | 1996-10-09 | 2003-10-14 | Kabushiki Kaisha Sega Enterprises | Game apparatus, game processing method, game execution method, and game system |
US20040140951A1 (en) * | 2003-01-17 | 2004-07-22 | Blish Jacob Adam | Foot operated computer mouse |
US20040153233A1 (en) * | 1997-04-25 | 2004-08-05 | Hitachi, Ltd. | Automotive control apparatus and method |
-
1999
- 1999-12-13 US US09/460,028 patent/US6747680B1/en not_active Expired - Lifetime
-
2004
- 2004-02-09 US US10/774,797 patent/US20040160458A1/en not_active Abandoned
Cited By (72)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090063960A1 (en) * | 2000-04-14 | 2009-03-05 | Picsel (Research) Ltd | User interface systems and methods for manipulating and viewing digital documents |
US8358290B2 (en) | 2000-04-14 | 2013-01-22 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US7009626B2 (en) | 2000-04-14 | 2006-03-07 | Picsel Technologies Limited | Systems and methods for generating visual representations of graphical data and digital document processing |
US8593436B2 (en) | 2000-04-14 | 2013-11-26 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US20010032221A1 (en) * | 2000-04-14 | 2001-10-18 | Majid Anwar | Systems and methods for generating visual representations of graphical data and digital document processing |
US20100192062A1 (en) * | 2000-04-14 | 2010-07-29 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US20100185948A1 (en) * | 2000-04-14 | 2010-07-22 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US20100185975A1 (en) * | 2000-04-14 | 2010-07-22 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US9778836B2 (en) | 2000-04-14 | 2017-10-03 | Samsung Electronics Co., Ltd. | User interface systems and methods for manipulating and viewing digital documents |
US9626073B2 (en) | 2002-03-19 | 2017-04-18 | Facebook, Inc. | Display navigation |
US9753606B2 (en) | 2002-03-19 | 2017-09-05 | Facebook, Inc. | Animated display navigation |
US9886163B2 (en) | 2002-03-19 | 2018-02-06 | Facebook, Inc. | Constrained display navigation |
US9851864B2 (en) | 2002-03-19 | 2017-12-26 | Facebook, Inc. | Constraining display in display navigation |
US10365785B2 (en) | 2002-03-19 | 2019-07-30 | Facebook, Inc. | Constraining display motion in display navigation |
US9360993B2 (en) | 2002-03-19 | 2016-06-07 | Facebook, Inc. | Display navigation |
US10055090B2 (en) | 2002-03-19 | 2018-08-21 | Facebook, Inc. | Constraining display motion in display navigation |
US9678621B2 (en) | 2002-03-19 | 2017-06-13 | Facebook, Inc. | Constraining display motion in display navigation |
US20040219980A1 (en) * | 2003-04-30 | 2004-11-04 | Nintendo Co., Ltd. | Method and apparatus for dynamically controlling camera parameters based on game play events |
US20130019200A1 (en) * | 2005-01-31 | 2013-01-17 | Roland Wescott Montague | Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag |
US9176653B2 (en) * | 2005-01-31 | 2015-11-03 | Roland Wescott Montague | Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag |
US20060271870A1 (en) * | 2005-05-31 | 2006-11-30 | Picsel Research Limited | Systems and methods for navigating displayed content |
US20060281471A1 (en) * | 2005-06-08 | 2006-12-14 | Cisco Technology,Inc. | Method and system for communicating using position information |
US8045998B2 (en) | 2005-06-08 | 2011-10-25 | Cisco Technology, Inc. | Method and system for communicating using position information |
US20070052732A1 (en) * | 2005-08-01 | 2007-03-08 | Microsoft Corporation | Resolution independent image resource |
US7626595B2 (en) * | 2005-08-01 | 2009-12-01 | Microsoft Corporation | Resolution independent image resource |
US7706339B2 (en) | 2005-08-10 | 2010-04-27 | Cisco Technology, Inc. | Method and system for communicating media based on location of media source |
US20070036100A1 (en) * | 2005-08-10 | 2007-02-15 | Cisco Technology, Inc. | Method and system for communicating media based on location of media source |
US20070036118A1 (en) * | 2005-08-10 | 2007-02-15 | Cisco Technology, Inc. | Method and system for automatic configuration of virtual talk groups based on location of media sources |
US20100197333A1 (en) * | 2005-08-10 | 2010-08-05 | Cisco Technology, Inc. | Method and System for Communicating Media Based on Location of Media Source |
US7636339B2 (en) | 2005-08-10 | 2009-12-22 | Cisco Technology, Inc. | Method and system for automatic configuration of virtual talk groups based on location of media sources |
US8472418B2 (en) | 2005-08-10 | 2013-06-25 | Cisco Technology, Inc. | Method and system for communicating media based on location of media source |
US20070047479A1 (en) * | 2005-08-29 | 2007-03-01 | Cisco Technology, Inc. | Method and system for conveying media source location information |
US7869386B2 (en) | 2005-08-29 | 2011-01-11 | Cisco Technology, Inc. | Method and system for conveying media source location information |
US8260338B2 (en) * | 2006-02-28 | 2012-09-04 | Cisco Technology, Inc. | Method and system for providing interoperable communications with dynamic event area allocation |
US20070202908A1 (en) * | 2006-02-28 | 2007-08-30 | Cisco Technology, Inc. | Method and system for providing interoperable communications with dynamic event area allocation |
US8397180B2 (en) | 2006-12-21 | 2013-03-12 | Canon Kabushiki Kaisha | Scrolling browser with previewing area |
US8307305B2 (en) | 2006-12-21 | 2012-11-06 | Canon Kabushiki Kaisha | Scrolling interface |
US8856684B2 (en) * | 2006-12-21 | 2014-10-07 | Canon Kabushiki Kaisha | Scrolling interface |
US20080150892A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Collection browser for image items with multi-valued attributes |
US20080155475A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Scrolling interface |
US20080155474A1 (en) * | 2006-12-21 | 2008-06-26 | Canon Kabushiki Kaisha | Scrolling interface |
US10606470B2 (en) | 2007-01-07 | 2020-03-31 | Apple, Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US10983692B2 (en) | 2007-01-07 | 2021-04-20 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US9619132B2 (en) | 2007-01-07 | 2017-04-11 | Apple Inc. | Device, method and graphical user interface for zooming in on a touch-screen display |
US11886698B2 (en) | 2007-01-07 | 2024-01-30 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US11461002B2 (en) | 2007-01-07 | 2022-10-04 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US11269513B2 (en) | 2007-01-07 | 2022-03-08 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
WO2010026044A1 (en) * | 2008-09-03 | 2010-03-11 | Volkswagen Ag | Method and device for displaying information, in particular in a vehicle |
US8875044B2 (en) | 2008-11-19 | 2014-10-28 | Sony Corporation | Image processing apparatus, image display method, and image display program |
US9063646B2 (en) * | 2008-11-19 | 2015-06-23 | Sony Corporation | Image processing apparatus, image processing method, and image display program |
US20100125786A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image display method, and image display program |
US20100123734A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Corporation | Image processing apparatus, image processing method, and image display program |
US8970630B2 (en) * | 2009-05-12 | 2015-03-03 | Sony Corporation | Information processing device, information processing method, and information processing program |
US20110134126A1 (en) * | 2009-05-12 | 2011-06-09 | Reiko Miyazaki | Information processing device, information processing method, and information processing program |
US8466934B2 (en) * | 2009-06-29 | 2013-06-18 | Min Liang Tan | Touchscreen interface |
US20100328351A1 (en) * | 2009-06-29 | 2010-12-30 | Razer (Asia-Pacific) Pte Ltd | User interface |
US9250791B2 (en) * | 2009-07-28 | 2016-02-02 | Sony Corporation | Display control device, display control method, and computer program |
US20120127107A1 (en) * | 2009-07-28 | 2012-05-24 | Ken Miyashita | Display control device, display control method, and computer program |
US9696809B2 (en) | 2009-11-05 | 2017-07-04 | Will John Temple | Scrolling and zooming of a portable device display with device motion |
US20110102455A1 (en) * | 2009-11-05 | 2011-05-05 | Will John Temple | Scrolling and zooming of a portable device display with device motion |
WO2011056209A1 (en) * | 2009-11-05 | 2011-05-12 | Will John Temple | Scrolling and zooming of a portable device display with motion |
US20110119578A1 (en) * | 2009-11-17 | 2011-05-19 | Schwartz Michael U | Method of scrolling items on a touch screen user interface |
KR101863654B1 (en) | 2010-11-03 | 2018-06-04 | 삼성전자 주식회사 | Controlling Method For Changing Screen based on a input signal And Portable Device supporting the same |
US9110582B2 (en) * | 2010-11-03 | 2015-08-18 | Samsung Electronics Co., Ltd. | Mobile terminal and screen change control method based on input signals for the same |
EP2450781A3 (en) * | 2010-11-03 | 2013-03-27 | Samsung Electronics Co., Ltd. | Mobile terminal and screen change control method based on input signals for the same |
US20120110501A1 (en) * | 2010-11-03 | 2012-05-03 | Samsung Electronics Co. Ltd. | Mobile terminal and screen change control method based on input signals for the same |
KR20120047195A (en) * | 2010-11-03 | 2012-05-11 | 삼성전자주식회사 | Controlling method for changing screen based on a input signal and portable device supporting the same |
CN102566897A (en) * | 2010-11-03 | 2012-07-11 | 三星电子株式会社 | Mobile terminal and method for changing and controlling screen based on input signal of the same |
JP2013097426A (en) * | 2011-10-28 | 2013-05-20 | Nintendo Co Ltd | Information processing program, information processing device, information processing system, and information processing method |
JP2014194773A (en) * | 2013-03-28 | 2014-10-09 | Samsung Electronics Co Ltd | Display method for display apparatus, and display apparatus |
DE102016212139A1 (en) * | 2016-07-04 | 2018-01-04 | Bayerische Motoren Werke Aktiengesellschaft | Method for displaying data elements, operating device for a vehicle, and vehicle comprising the operating device |
US11244427B2 (en) * | 2018-04-27 | 2022-02-08 | Tencent Technology (Shenzhen) Company Ltd | Image resolution processing method, system, and apparatus, storage medium, and device |
Also Published As
Publication number | Publication date |
---|---|
US6747680B1 (en) | 2004-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6747680B1 (en) | Speed-dependent automatic zooming interface | |
US9086791B2 (en) | Methods, systems, and media for providing content-aware scrolling | |
CN1848081B (en) | User interface systems and methods for viewing and manipulating digital documents | |
US7256801B2 (en) | Elastic presentation space | |
US6816174B2 (en) | Method and apparatus for variable density scroll area | |
TW426831B (en) | Digital map display zooming method, digital map display zooming device, and storage medium for storing digital map display zooming program | |
US6411274B2 (en) | Digital map display zooming method, digital map display zooming device, and storage medium for storing digital map display zooming program | |
EP0881563B1 (en) | Digital map display scrolling method and device | |
US8112705B2 (en) | Magnifying the text of a link while still retaining browser function in the magnified display | |
US7661072B2 (en) | Accelerated scrolling | |
US20130254659A1 (en) | Visual Screen Indicator | |
US10353533B2 (en) | Manipulating visual representations of data | |
WO2007002134A2 (en) | Interactive scaling feature having scalability in three dimensional space | |
US20020109687A1 (en) | Visibility and usability of displayed images | |
JP2009181569A6 (en) | Information display method, program, and information display system | |
WO2008024182A2 (en) | Choosing ranges from a spiral scale display | |
Sun et al. | Flipper: a new method of digital document navigation | |
JPH0212516A (en) | Actual dimension display system | |
Lee et al. | Enhancing web accessibility | |
Mitchell | Focus+context screens: A study and evaluation |
Krum et al. | Supporting Interaction as a Secondary Task in Geo-Spatial Applications | |
TH44516A (en) | Dynamically scalable 3D images for browser windows, or user interaction via diagram |
CA2425990A1 (en) | Elastic presentation space |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IGARASHI, TAKEO;HORVITZ, ERIC;HINCKLEY, KENNETH P.;REEL/FRAME:014983/0086;SIGNING DATES FROM 20000412 TO 20000428 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0001 Effective date: 20141014 |