US20110138320A1 - Peek Around User Interface - Google Patents

Peek Around User Interface

Info

Publication number
US20110138320A1
US20110138320A1 (application US13/019,770)
Authority
US
United States
Prior art keywords
viewing angle
desktop
display screen
viewing
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/019,770
Inventor
David P. Vronay
Lili Cheng
Baining Guo
Sean U. Kelly
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US13/019,770
Publication of US20110138320A1
Assigned to Microsoft Technology Licensing, LLC; assignment of assignors interest (see document for details). Assignors: Microsoft Corporation
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Abstract

An operating system shell has an underlying desktop object that is rendered according to different views. The operating system shell renders on a display screen a desktop graphical user interface with windows, tools, icons, etc. that are within a segment of the desktop object that can be observed (i.e., rendered) from one of the views. In illustrated implementations, the desktop object is of an extent that is greater than can be rendered from a single view. Allowing a user to select or access different views of the desktop object effectively provides an extended desktop that overcomes the fixed and limited display capabilities of conventional operating system shells.

Description

    RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 10/112,394, filed on Mar. 29, 2002, which is incorporated by reference herein in its entirety.
  • FIELD OF THE INVENTION
  • The present invention relates to graphical user interfaces for computer operating systems and, in particular, to a graphical user interface that may be rendered according to different views to provide an enlarged operating system desktop.
  • BACKGROUND AND SUMMARY OF THE INVENTION
  • It is now common for operating systems to have a shell that provides a graphical user interface (GUI). The shell is a piece of software (either a separate program or component part of the operating system) that provides direct communication between the user and the operating system. The graphical user interface typically provides a graphical icon-oriented and/or menu driven environment for the user to interact with the operating system.
  • The graphical user interface of many operating system shells is based on a desktop metaphor that creates a graphical environment simulating work at a desk. These graphical user interfaces typically employ a windowing environment within the desktop.
  • The windowing environment presents the user with specially delineated areas of the screen called windows, each of which is dedicated to a particular application program, file or document. Each window can act independently, as if it were a virtual display device under control of its particular application program. Windows can typically be resized, moved around the display, and stacked so as to overlay another. In some windowing environments, windows can be minimized to an icon or increased to a full-screen display.
  • Windows may be rendered beside each other or may have a top to bottom order in which they are displayed, with top windows at a particular location on the screen overlaying any other window at that same location according to a z-order (an order of the windows along a conceptual z-axis normal to the desktop or display screen). The top-most window has the “focus” and accepts the user's input. The user can switch other windows to the top (and thereby change the z-order) by clicking on the window with a mouse or other pointer device, or by inputting certain key combinations. This allows the user to work with multiple application programs, files and documents in a manner similar to physically working with multiple paper documents and items that can be arbitrarily stacked or arranged on an actual desk.
  • Typically, the physical dimensions of a computer display screen are far more limited than users' desire to have different windows, tools, icons, etc. rendered simultaneously, and than the ability of operating system shells to render them. The result is that the limited extent of display screen “real estate” can constrain the ability of operating system shells to render multiple windows, tools, icons, etc. simultaneously.
  • A variety of prior implementations have attempted to compensate for the fixed and limited extent of display screens. In one prior implementation, referred to as morphing, objects (e.g., windows) are quickly transformed into smaller representations or symbols to reduce the amount of display screen area they require. For example, a window may be minimized to a symbol that is rendered on a task bar along one edge of the display screen. The working size of the object may then be re-generated by selecting or activating the symbol.
  • In another prior implementation, referred to as scrolling, some objects (e.g., windows) are accessed from an unrendered, off-screen region by scrolling the objects into the fixed display screen area. For example, the user could be provided a graphical user interface affordance (such as a scroll bar) with which the off-screen objects are moved into view.
  • In yet another prior implementation, referred to as pop-ups/drop-downs, a user interface affordance (e.g., a menu name) is acted on by a user to produce an overlay of other elements, such as a window full of menu items that are separately selectable. Typically, this overlay is easily dismissed from the display screen. Finally, in still another prior implementation, referred to as drawers, a user interface affordance at the edge of a display screen or window can be pulled out to reveal an overlay of objects or menu items, in the manner of a cabinet drawer. Typically, the user can control how far the drawer is pulled out to reveal more or fewer of the objects.
  • Such prior implementations attempting to compensate for the fixed and limited extent of display screens may be characterized as allowing a user either to move objects onto the fixed display screen area (e.g., as in scrolling, pop-ups/drop-downs, or drawers) or to move objects off the display screen or reduce their size (e.g., morphing). An aspect of the present invention is that the fixed and limited extent of display screens may be effectively extended or enlarged by providing different views of an underlying desktop object.
  • The present invention provides an operating system shell with an underlying desktop object that is rendered according to different views. The operating system shell renders on a display screen a desktop graphical user interface with windows, tools, icons, etc. that are within a segment of the desktop object that can be observed (i.e., rendered) from one of the views. In illustrated implementations, the desktop object is of an extent that is greater than can be rendered from a single view. Allowing a user to select or access different views of the desktop object effectively provides an extended desktop that overcomes the fixed and limited display capabilities of conventional operating system shells.
  • In one implementation, a variable viewing angle interface is rendered in accordance with first and second viewing angles, the first viewing angle being perpendicular to the desktop object and the second viewing angle being non-perpendicular to the desktop object. A user-controlled viewing selection corresponding to one of perpendicular and angled views is obtained and encompasses one of respective first and second regions of the desktop object. The operating system graphical user interface is rendered as a three-dimensional image transformation of the desktop object in accordance with the selected view.
  • The present invention allows use of a desktop object that is larger than or extended relative to a conventional display screen. Changing between the different views, such as changing from the perpendicular view to the angled view, is akin to taking a “peek” around an obstruction, in this case the edge of a display screen. Accordingly, this use of different image transformation representations to provide different views of a desktop object may sometimes be referred to as a “peek-around” user interface that quickly reveals portions of the desktop object that would normally not be seen.
  • Additional objects and advantages of the present invention will be apparent from the detailed description of the preferred embodiment thereof, which proceeds with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a computer system that may be used to implement the present invention.
  • FIG. 2 is a diagram illustrating a desktop-based graphical user interface with a perpendicular view of an underlying desktop according to the present invention.
  • FIG. 3 is a top plan view of an image transformation representation corresponding to the perpendicular view of the desktop of FIG. 2.
  • FIG. 4 is a diagram illustrating graphical user interface with an angled-view of an underlying desktop according to the present invention.
  • FIG. 5 is a top plan view of an image transformation representation corresponding to the angled view of the desktop of FIG. 4.
  • FIG. 6 is an image transformation representation illustrating a perpendicular view of a desktop with a non-planar, stepped desktop object.
  • FIG. 7 is an image transformation representation illustrating a perpendicular view of a desktop with a non-planar desktop object having inclined segments.
  • FIGS. 8A and 8B are image transformation representations illustrating perpendicular views of a planar desktop object at different first and second image distances.
  • FIG. 9 is a flow diagram of a desktop shell rendering method for selectively generating different views of a desktop-based graphical user interface.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • FIG. 1 illustrates an operating environment for an embodiment of the present invention as a computer system 20 with a computer 22 that comprises at least one high speed processing unit (CPU) 24 in conjunction with a memory system 26, an input device 28, and an output device 30. These elements are interconnected by at least one bus structure 32.
  • The illustrated CPU 24 is of familiar design and includes an ALU 34 for performing computations, a collection of registers 36 for temporary storage of data and instructions, and a control unit 38 for controlling operation of the system 20. The CPU 24 may be a processor having any of a variety of architectures including Alpha from Digital, MIPS from MIPS Technology, NEC, IDT, Siemens, and others, x86 from Intel and others, including Cyrix, AMD, and Nexgen, and the PowerPC from IBM and Motorola.
  • The memory system 26 generally includes high-speed main memory 40 in the form of a medium such as random access memory (RAM) and read only memory (ROM) semiconductor devices, and secondary storage 42 in the form of long term storage mediums such as floppy disks, hard disks, tape, CD-ROM, flash memory, etc. and other devices that store data using electrical, magnetic, optical or other recording media. The main memory 40 also can include video display memory for displaying images through a display device. Those skilled in the art will recognize that the memory 26 can comprise a variety of alternative components having a variety of storage capacities.
  • The input and output devices 28 and 30 also are familiar. The input device 28 can comprise a keyboard, a mouse, a physical transducer (e.g., a microphone), etc. The output device 30 can comprise a display, a printer, a transducer (e.g., a speaker), etc. Some devices, such as a network interface or a modem can be used as input and/or output devices.
  • As is familiar to those skilled in the art, the computer system 20 further includes an operating system 44 and typically at least one application program 46. Operating system 44 is the set of software that controls the computer system operation and the allocation of resources. Application program 46 is the set of software that performs a task desired by the user, using computer resources made available through operating system 44. Both are resident in the illustrated memory system 26.
  • In accordance with the practices of persons skilled in the art of computer programming, the present invention is described below with reference to acts and symbolic representations of operations that are performed by computer system 20, unless indicated otherwise. Such acts and operations are sometimes referred to as being computer-executed and may be associated with the operating system or the application program as appropriate. It will be appreciated that the acts and symbolically represented operations include the manipulation by the CPU 24 of electrical signals representing data bits which causes a resulting transformation or reduction of the electrical signal representation, and the maintenance of data bits at memory locations in memory system 26 to thereby reconfigure or otherwise alter the computer system's operation, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, or optical properties corresponding to the data bits.
  • Operating system 44 has a shell 48 that provides a graphical user interface (GUI). The shell 48 is a piece of software (either a separate program or component part of the operating system) that provides direct communication between the user and operating system 44. The graphical user interface typically provides a graphical icon-oriented and/or menu driven environment for the user to interact with the operating system. The graphical user interface of many operating system shells is based on or referred to as a desktop metaphor in which a graphical environment simulates working at a desk. These graphical user interfaces typically employ a windowing environment within the desktop metaphor.
  • FIG. 2 is a diagram illustrating a desktop-based graphical user interface 50 with a perpendicular view of an underlying desktop 52 over which are rendered windows 54 and 56 and a portion of a window 58. (An unrendered portion of window 58 is indicated by dashed lines.) It will be appreciated that any number of windows could be rendered on desktop 52. Windows 54-58 are rendered by shell 48 and allow a user to interact with operating system 44 or an application 46 running on operating system 44.
  • Desktop-based graphical user interface 50 provides a plan view of desktop 52 and windows 54-58. In the plan view, the desktop 52 and windows 54-58 are represented as being in one or more planes that are perpendicular to a predefined line of vision from a user.
  • FIG. 3 is a top plan view of an image transformation representation 70 corresponding to the perpendicular view of desktop 52 in graphical user interface 50. Image transformation representation 70 includes a viewpoint 72 (indicated schematically as an image plane 72 a of a camera 72) with a viewing range 74 and a perpendicular orientation to an extended desktop object 76. The perpendicular orientation of viewpoint 72 encompasses a central segment 78 of extended desktop object 76 and omits lateral segments 80 and 82 of extended desktop object 76.
  • Image transformation representation 70 illustrates that the appearance of desktop 52 rendered on a computer display screen is based upon a three-dimensional image transformation in accordance with the present invention. Accordingly, desktop 52 corresponds to a view of desktop object 76 at viewpoint 72 having a perpendicular orientation. Such an image transformation may be generated by a conventional transformation matrix representing a three-dimensional rotation about a Y-axis and being of the form:
  • M = \begin{bmatrix} \cos A & 0 & -\sin A & 0 \\ 0 & 1 & 0 & 0 \\ \sin A & 0 & \cos A & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},
  • where A is the angle of rotation. The matrix M is multiplied by a matrix corresponding to an object being rendered (e.g., a window and any features to be rendered within it) to generate the resulting view, as is known in the art of three-dimensional rendering. While it is sometimes used in applications that provide three-dimensional spatial representations, this type of three-dimensional projection transformation calculation is not the typical basis used by a shell 48 to generate a desktop graphical user interface.
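  • As a concrete illustration of this transformation (a minimal sketch, not code from the patent; the function name rotate_y and the sample coordinates are assumptions for illustration), the matrix M can be built for an angle A and applied to a homogeneous window corner. With A = 0 the matrix reduces to the identity, which reproduces the perpendicular view; a non-zero A yields the angled view.

```python
import numpy as np

def rotate_y(angle_deg: float) -> np.ndarray:
    """Build the 4x4 rotation matrix M about the Y-axis for angle A (degrees)."""
    a = np.radians(angle_deg)
    return np.array([
        [np.cos(a), 0.0, -np.sin(a), 0.0],
        [0.0,       1.0,  0.0,       0.0],
        [np.sin(a), 0.0,  np.cos(a), 0.0],
        [0.0,       0.0,  0.0,       1.0],
    ])

# A window corner on the desktop plane, in homogeneous coordinates (x, y, z, w).
corner = np.array([2.0, 1.0, 0.0, 1.0])

print(rotate_y(0.0) @ corner)    # A = 0: identity, i.e., the perpendicular view
print(rotate_y(30.0) @ corner)   # A = 30: angled view; x shrinks and z becomes non-zero
```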
  • The perpendicular view of desktop 52 may have an appearance similar to that of a conventional desktop graphical user interface. It will be appreciated, however, that perpendicular view of desktop 52 is generated in a manner different from that of a conventional desktop graphical user interface. The three-dimensional projection transformation calculation above is used to generate both the perpendicular and angled views of desktop-based graphical user interface 50. In contrast, a conventional desktop style graphical user interface is typically generated as a simple two-dimensional representation that is incapable of accommodating the different viewing angles provided by the present invention.
  • FIG. 4 is a diagram illustrating graphical user interface 50 with an angled view of underlying desktop 52 over which are rendered windows 54, 56, 58, and 60. The angled views of windows 54-60 are rendered by the shell 48 of operating system 44 and provide an extended view of desktop 52 that allows the user to interact with operating system 44 or an application 46 running on operating system 44.
  • In the angled view of FIG. 4, the desktop 52 and windows 54-60 are represented as being in one or more planes that are not perpendicular to a predefined line of vision from a user. In the illustrated implementation, the angled view is angled laterally relative to the perpendicular view. In the angled view, the desktop 52 and windows 54-60 are represented as having a non-perpendicular orientation to a central predefined line of vision from viewpoint 72 to the display screen. As a result, windows 54-60 are rendered with a parallax that causes the otherwise rectangular windows 54-60 to have trapezoidal shapes. It will be appreciated that the parallax of windows 54-60 in FIG. 4 would also affect any graphics, images, text, etc. rendered within windows 54-60.
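  • The trapezoidal appearance described above follows directly from applying the rotation and then a perspective projection. The sketch below is an illustration under an assumed viewing distance and focal length, not the patent's rendering code: it projects the four corners of a rectangular window after a 30-degree lateral rotation, and the projected near edge comes out taller than the far edge, i.e., a trapezoid.

```python
import numpy as np

def project_corner(p, angle_deg, view_dist=5.0, focal=5.0):
    """Rotate a desktop-plane point about the Y-axis, then perspective-project it."""
    a = np.radians(angle_deg)
    x, y, z = p
    xr = np.cos(a) * x - np.sin(a) * z   # same rotation as matrix M above
    zr = np.sin(a) * x + np.cos(a) * z
    z_cam = zr + view_dist               # distance of the rotated point from the viewpoint
    return focal * xr / z_cam, focal * y / z_cam

# Corners of a rectangular window lying on the desktop plane (z = 0).
window = [(1, -1, 0), (1, 1, 0), (3, -1, 0), (3, 1, 0)]
for corner in window:
    print(project_corner(corner, angle_deg=30.0))
# The near (left) edge projects taller than the far (right) edge,
# so the on-screen outline is a trapezoid rather than a rectangle.
```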
  • FIG. 5 is a top plan view of an image transformation representation 100 corresponding to the angled view of desktop 52 in graphical user interface 50. Image transformation representation 100 includes a viewpoint 102 with a viewing range 104 and a laterally non-perpendicular orientation to desktop object 76. Viewing range 104, established by the non-perpendicular orientation of viewpoint 102, encompasses a major side desktop segment 106. A second, minor side desktop segment 108 is not included in viewing range 104.
  • Image transformation representations 70 and 100 allow desktop object 76 to be larger than or extended relative to a conventional desktop object. The pivoting or rotation distinguishing viewpoints 72 and 102 makes the change from the perpendicular view to the angled view akin to taking a “peek” around an obstruction, in this case the edge of a display screen. Accordingly, this use of different image transformation representations to provide different views of a desktop object may sometimes be referred to as a “peek-around” user interface that quickly reveals portions of the desktop object that would normally not be seen.
  • As with conventional desktop-style graphical user interfaces, graphical user interface 50 of the present invention allows a user to manipulate and move windows rendered on desktop 52. For example, users may move windows between central segment 78 corresponding to the perpendicular view of FIGS. 2 and 3 and segments 80 and 82 that can be encompassed within angled views.
  • An optional aspect of graphical user interface 50 is that users could move windows between central segment 78 and segments 80 and 82 with keystroke or cursor controller (e.g., mouse) actions. For example, a window that is in one of segments 80 and 82 and rendered in an angled view of desktop object 76 could be moved to central segment 78 by a user selecting or activating the window. Likewise, a window that is in central segment 78 and rendered in the perpendicular view of desktop object 76 could be moved to one of segments 80 and 82 by a predefined keyboard action by the user or by the user dragging a predefined portion of the window beyond a margin of the display screen.
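  • A hedged sketch of this optional behavior follows; the patent describes the interactions but not an API, so the function names, segment labels, and pixel thresholds below are assumptions for illustration only.

```python
SCREEN_WIDTH = 1920
EDGE_MARGIN = 8      # pixels from a side margin that trigger a segment change (assumed)

def segment_after_drag(window_x: int, current_segment: str) -> str:
    """Dragging a window past a side margin moves it to a lateral segment (e.g., 80 or 82)."""
    if window_x <= EDGE_MARGIN:
        return "left_lateral"                  # e.g., segment 80
    if window_x >= SCREEN_WIDTH - EDGE_MARGIN:
        return "right_lateral"                 # e.g., segment 82
    return current_segment                     # otherwise the window stays where it is

def segment_after_activation(current_segment: str) -> str:
    """Selecting or activating a window in a lateral segment brings it to central segment 78."""
    if current_segment in ("left_lateral", "right_lateral"):
        return "central"
    return current_segment
```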
  • Extended desktop object 76 in FIGS. 3 and 5 is represented as a planar image surface that is generally parallel to the display screen on which desktop 52 is rendered. Other aspects of the present invention are that extended desktop objects of other configurations may be used and that image transformation representations other than viewpoint rotation may be used to access and render marginal segments of an extended desktop object.
  • FIG. 6 is an image transformation representation 120 illustrating a perpendicular view of a desktop (not shown) in a graphical user interface (not shown). Image transformation representation 120 includes a viewpoint 126 with a viewing range 128 extending over a planar central segment 130 of a non-planar, stepped desktop object 132. Non-planar desktop object 132 further includes lateral segments 134 and 136 that are generally parallel to central segment 130, but correspond to a depth or distance 138 from viewpoint 126 greater than depth or distance 140 to central segment 130.
  • The depth or distance 138 of lateral segments 134 and 136 causes windows (not shown) that are positioned within segments 134 and 136 to appear farther from viewpoint 126 and, as a result, to be rendered with a correspondingly smaller size that allows more objects (e.g., windows) to be rendered or discerned. It will be appreciated that the generation or rendering of windows or other objects in lateral segments 134 and 136, in comparison to the rendering in central segment 130, is readily accommodated by a depth factor in the conventional transformation matrix calculation for the display.
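  • The remark about a depth factor can be made concrete with a small sketch (illustrative values only; the depths below merely stand in for distances 138 and 140): under a pinhole projection, apparent size scales as focal length divided by depth, so windows on the deeper lateral segments render proportionally smaller.

```python
def apparent_scale(depth: float, focal: float = 1.0) -> float:
    """Perspective scale factor for an object at the given depth from the viewpoint."""
    return focal / depth

central_depth = 1.0   # stands in for distance 140 to central segment 130
lateral_depth = 1.5   # stands in for the greater distance 138 to lateral segments 134 and 136

print(apparent_scale(central_depth))   # 1.0  -> full-size windows in the central segment
print(apparent_scale(lateral_depth))   # ~0.67 -> smaller windows, so more of them fit on screen
```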
  • FIG. 7 is an image transformation representation 150 illustrating a perpendicular view of a desktop (not shown) in a graphical user interface (not shown). Image transformation representation 150 includes a viewpoint 156 with a viewing range 158 extending over a planar central segment 160 of a non-planar desktop object 162. Non-planar desktop object 162 further includes lateral segments 164 and 166 that are inclined (i.e., generally not parallel) relative to central segment 160, and correspond to a depth or distance 168 from viewpoint 156 typically greater than depth or distance 170 to central segment 160.
  • Lateral segment 164 includes a pair of oppositely inclined regions 172 and 174, with inner region 172 being positioned between central segment 160 and outer region 174. Likewise, lateral segment 166 includes a pair of oppositely inclined regions 176 and 178, with inner region 176 being positioned between central segment 160 and outer region 178. In the illustrated implementation, inner inclined regions 172 and 176 are of generally the same size and inclination as outer regions 174 and 178, respectively. It will be appreciated, however, that inner regions 172 and 176 could be of a size or inclination that differs from those of regions 174 and 178. For example, inner regions 172 and 176 could be shorter and steeper than regions 174 and 178. It will be appreciated that the generation or rendering of windows or other objects in lateral segments 164 and 166, in comparison to the rendering in central segment 160, is readily accommodated by a depth factor in the conventional transformation matrix calculation for the display.
  • The inclinations of inner regions 172 and 176 will result in any windows rendered in those regions having greater parallax than windows rendered in the lateral segments of a non-inclined desktop object (e.g., FIGS. 4 and 5). Conversely, the inclinations of outer regions 174 and 178 will result in any windows rendered in those regions being rendered with little or no parallax. It will be appreciated, therefore, that relatively steep, narrow inner regions 172 and 176 could provide visual transitions to wider, extended outer regions 174 and 178 to give a user an extended, parallax-free desktop.
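  • The parallax trade-off can be approximated with a one-line foreshortening estimate (a rough sketch with assumed angles, ignoring perspective effects): a segment inclined by an angle theta relative to the image plane foreshortens lengths along the tilt direction by roughly cos(theta), so steep inner regions show strong foreshortening and parallax while outer regions lying nearly parallel to the image plane show almost none.

```python
import math

def foreshortening(incline_deg: float) -> float:
    """Approximate length scale, along the tilt direction, of a segment inclined by incline_deg."""
    return math.cos(math.radians(incline_deg))

print(foreshortening(60.0))  # steep inner region (e.g., 172 or 176): 0.5, strong foreshortening
print(foreshortening(5.0))   # nearly parallel outer region (e.g., 174 or 178): ~0.996, little parallax
```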
  • The non-planar desktop object 162 of graphical user interface 154 is merely one example illustrating that graphical user interfaces of the present invention could employ a variety of non-planar desktop objects. Alternative desktop objects could employ other combinations of flat segments, as illustrated, or could employ segments with smooth or continuous configurations. It will be appreciated that the generation or rendering of windows or other objects on such desktop objects, in comparison to the rendering in central segment 130, is readily accommodated by a depth factor in the conventional transformation matrix calculation for the display.
  • FIG. 8A is an image transformation representation 180 illustrating a first perpendicular view of a desktop (not shown) on a desktop object 186 in a graphical user interface (not shown). Image transformation representation 180 includes a viewpoint 190 that is a first distance 192 from desktop object 186 and includes a viewing range 192 extending over a central segment 194. Lateral segments 196 and 198 of desktop object 186 are not included within viewing range 192.
  • FIG. 8B is an image transformation representation 200 illustrating a second perpendicular view of the desktop (not shown) on desktop object 186 in the graphical user interface (not shown). Image transformation representation 200 includes viewpoint 190 at a second distance 204 from desktop object 186 and includes a viewing range 206 extending over all of desktop object 186. Second distance 204 between viewpoint 190 and desktop object 186 is greater than first distance 192, so that viewing range 206 encompasses all of desktop object 186 while viewing range 192 encompasses only central segment 194.
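  • The relationship illustrated by FIGS. 8A and 8B reduces to simple geometry, sketched below with an assumed field of view: for a fixed viewing range, the width of the desktop object covered on screen grows linearly with the viewpoint's distance from it, which is why the greater second distance 204 brings the entire desktop object into view.

```python
import math

def visible_width(distance: float, fov_deg: float = 60.0) -> float:
    """Width of the desktop slice covered by the viewing range at a given viewpoint distance."""
    return 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)

print(visible_width(1.0))  # shorter first distance: only the central segment fits on screen
print(visible_width(3.0))  # greater second distance: three times the width, the whole desktop object
```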
  • Image transformation representations 180 and 200 illustrate that the use of three-dimensional image transformations for rendering operating system displays may extend beyond lateral rotations. It will be appreciated that the generation or rendering of windows or other objects in image transformation representations 180 and 200 is readily accommodated by a depth factor in the conventional transformation matrix calculation for the display.
  • FIG. 9 is a flow diagram of a desktop shell rendering method 220 for selectively generating perpendicular and angled views of desktop-based graphical user interface 50. It will be appreciated that method 220 is similarly applicable to generating alternative desktop views described with reference to FIGS. 6-8, and other alternative desktop views as well.
  • Process block 222 indicates that an extended desktop object (e.g., extended desktop object 76) is defined to have at least one dimension greater than a corresponding display screen. For example, the extended desktop object may have only a lateral dimension that is greater than a corresponding display screen dimension, as with exemplary extended desktop object 76. Alternatively, the extended desktop object may have only a vertical dimension that is greater than a corresponding display screen dimension, or may have both a lateral and a vertical dimension that are greater than the corresponding display screen dimensions.
  • Process block 224 indicates that a viewpoint (e.g., viewpoint 72) is established for determining a view of the desktop object.
  • Process block 226 indicates that a viewing angle is selected between the viewpoint and the extended desktop object. As an example, a default perpendicular viewing angle may be defined. An angled, non-perpendicular viewing angle may be selected either upon a specific user command or automatically upon a user positioning a cursor at or within a predefined distance of a side margin of the display screen. Alternatively, eye pupil motion detection may be employed to detect a user looking to a side margin of a display.
  • Process block 228 indicates that a desktop graphical user interface is rendered in accordance with the selected viewing angle.
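  • A hedged end-to-end sketch of method 220 follows (process blocks 222-228). The patent describes the flow rather than an implementation, so the function names, the cursor-based trigger zone, and the peek angle below are assumptions; rotate_y() and project_corner() from the earlier sketches could supply the actual transformation in block 228.

```python
def select_viewing_angle(cursor_x: int, screen_width: int,
                         edge_zone: int = 16, peek_angle: float = 30.0) -> float:
    """Block 226: default to the perpendicular view, peek when the cursor nears a side margin."""
    if cursor_x <= edge_zone:
        return -peek_angle               # peek toward the left lateral segment
    if cursor_x >= screen_width - edge_zone:
        return peek_angle                # peek toward the right lateral segment
    return 0.0                           # perpendicular (default) viewing angle

def render_desktop(cursor_x: int, screen_width: int = 1920) -> None:
    desktop_width = 3 * screen_width                      # block 222: extended desktop object
    viewpoint = (desktop_width / 2.0, 0.0, -5.0)          # block 224: establish a viewpoint
    angle = select_viewing_angle(cursor_x, screen_width)  # block 226: select a viewing angle
    # Block 228: render the desktop GUI by applying the Y-axis rotation M(angle)
    # and projecting each window, e.g., with rotate_y() and project_corner() above.
    print(f"viewpoint={viewpoint}, viewing angle={angle} degrees")

render_desktop(cursor_x=960)    # cursor mid-screen  -> perpendicular view
render_desktop(cursor_x=1910)   # cursor near margin -> angled, peek-around view
```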
  • Having described and illustrated the principles of our invention with reference to an illustrated embodiment, it will be recognized that the illustrated embodiment can be modified in arrangement and detail without departing from such principles. It should be understood that the programs, processes, or methods described herein are not related or limited to any particular type of computer apparatus, unless indicated otherwise. Various types of general purpose or specialized computer apparatus may be used with or perform operations in accordance with the teachings described herein. Elements of the illustrated embodiment shown in software may be implemented in hardware and vice versa.
  • In view of the many possible embodiments to which the principles of our invention may be applied, it should be recognized that the detailed embodiments are illustrative only and should not be taken as limiting the scope of our invention. Rather, we claim as our invention all such embodiments as may come within the scope and spirit of the following claims and equivalents thereto.

Claims (20)

1. One or more computer-readable media storing instructions that, when executed on one or more processors, configure the one or more processors to present a graphical user interface, comprising:
providing a variable viewing angle interface presented, on a display screen with a surface, in accordance with at least first and second viewing angles, the first viewing angle being perpendicular to a desktop object that is presented parallel to the surface of the display screen and the second viewing angle being non-perpendicular to the desktop object, the variable viewing angle interface employing parallax comprising causing at least one rectangularly presented window object to be presented trapezoidally in the second viewing angle.
2. The one or more computer-readable media of claim 1, the second viewing angle being pivoted about a vertical axis relative to the first viewing angle to provide a different lateral view with respect to the desktop object.
3. The one or more computer-readable media of claim 1, further comprising providing a user-controlled viewing angle selector for selecting between the first and second viewing angles.
4. The one or more computer-readable media of claim 3, the user-controlled viewing angle selector providing a selection between the first and second viewing angles according to a position of an operating system-controlled cursor within a predefined distance from a margin of the display screen.
5. The one or more computer-readable media of claim 3, the user-controlled viewing angle selector providing a selection between the first and second viewing angles according to a position of an operating system cursor within a predefined distance from a side margin of the display screen.
6. The one or more computer-readable media of claim 1, further comprising transitioning between the first and second viewing angles, the transitioning effectuated by eye pupil motion detection of a user's eyes.
7. The one or more computer-readable media of claim 1, the first and second viewing angles being defined by a physical orientation of a line of sight directed from a user's eyes in relation to the display screen.
8. A system comprising:
memory;
one or more processors communicatively coupled to the memory;
a variable viewing angle interface module, stored in the memory and executable on the one or more processors, to present at least first and second viewing angles, the first viewing angle being perpendicular to a desktop object that is presented parallel to a surface of a display screen and the second viewing angle being non-perpendicular to the desktop object, the variable viewing angle interface module employing parallax to display one of the viewing angles.
9. The system of claim 8, wherein employing parallax comprises causing at least one rectangularly presented window object to be presented trapezoidally.
10. The system of claim 8, wherein employing parallax comprises shifting a rectangularly presented object into a trapezoidally presented object.
11. The system of claim 8, the second viewing angle being pivoted about a viewing axis that is substantially parallel to the desktop object and has a substantially vertical orientation with respect to the display screen.
12. The system of claim 8, further comprising a user-controlled viewing angle selector module, stored in the memory and executable on the one or more processors, to select between the first viewing angle and second viewing angle.
13. The system of claim 12, the user-controlled viewing angle selector module being configured to provide selection between the first and second viewing angles according to a position of an operating system-controlled cursor being within a predefined distance from a margin of the display screen.
14. The system of claim 8, a transition between the first and second viewing angles being effectuated by eye pupil motion detection of a user's eyes.
15. The system of claim 8, the first and second viewing angles being defined by a physical orientation of a line of sight directed from a user's eyes in relation to the display screen.
16. A method comprising:
providing a first viewing angle perpendicular to a desktop object that is presented parallel to a surface of a display screen; and
employing parallax to provide a second viewing angle being non-perpendicular to the desktop object.
17. The method of claim 16, further comprising displaying the first viewing angle and/or the second viewing angle based at least in part on a position of an operating system-controlled cursor.
18. The method of claim 16, further comprising displaying the first viewing angle and/or the second viewing angle based at least in part on eye pupil motion detection.
19. The method of claim 16, the first and second viewing angles corresponding to respective first and second viewpoints relative to the desktop object.
20. The method of claim 16, wherein providing the second viewing angle includes defining the second viewing angle as being pivoted from the first viewing angle about a vertical axis to provide a different lateral view with respect to the display screen.
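To make concrete the parallax recited in claims 1, 9, and 10, in which a rectangularly presented window object is presented trapezoidally at the non-perpendicular second viewing angle, the following is a minimal sketch of one way such a projection could be computed. The pinhole projection model, the constants FOCAL_LENGTH and VIEW_DISTANCE, and the functions project_corner and project_window are illustrative assumptions and are not part of the claimed subject matter.

```python
# Illustrative sketch of the parallax effect: a rectangular window object on
# the desktop plane, viewed from a viewpoint pivoted about a vertical axis,
# projects to a trapezoid on the display screen. The pinhole projection
# model and all constants are assumptions made only for illustration.

import math

FOCAL_LENGTH = 1000.0   # assumed distance from viewpoint to screen plane
VIEW_DISTANCE = 2000.0  # assumed distance from viewpoint to desktop plane


def project_corner(x: float, y: float, angle_deg: float) -> tuple[float, float]:
    """Project one desktop-plane point for a view pivoted by angle_deg about
    the vertical axis; at 0 degrees this reduces to a uniform scale."""
    theta = math.radians(angle_deg)
    x_cam = x * math.cos(theta)                 # rotate about the vertical axis
    z_cam = VIEW_DISTANCE + x * math.sin(theta)
    return (FOCAL_LENGTH * x_cam / z_cam,       # pinhole projection
            FOCAL_LENGTH * y / z_cam)


def project_window(left: float, top: float, right: float, bottom: float,
                   angle_deg: float) -> list[tuple[float, float]]:
    """Return the projected corners of a rectangularly presented window."""
    corners = [(left, top), (right, top), (right, bottom), (left, bottom)]
    return [project_corner(x, y, angle_deg) for (x, y) in corners]


# First (perpendicular) viewing angle: the window remains rectangular.
print(project_window(-200, -100, 200, 100, 0.0))
# Second (non-perpendicular) viewing angle: the same window becomes a
# trapezoid, since the nearer vertical edge projects taller than the farther.
print(project_window(-200, -100, 200, 100, 30.0))
```

In this sketch the pivot is about the vertical axis, in the manner of claims 2 and 20, so only the lateral view changes; vertical window edges remain vertical in the projection, which is why the rectangle maps to a trapezoid rather than a general quadrilateral.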
US13/019,770 2002-03-29 2011-02-02 Peek Around User Interface Abandoned US20110138320A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/019,770 US20110138320A1 (en) 2002-03-29 2011-02-02 Peek Around User Interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/112,394 US7904826B2 (en) 2002-03-29 2002-03-29 Peek around user interface
US13/019,770 US20110138320A1 (en) 2002-03-29 2011-02-02 Peek Around User Interface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/112,394 Continuation US7904826B2 (en) 2002-03-29 2002-03-29 Peek around user interface

Publications (1)

Publication Number Publication Date
US20110138320A1 true US20110138320A1 (en) 2011-06-09

Family

ID=28453321

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/112,394 Expired - Fee Related US7904826B2 (en) 2002-03-29 2002-03-29 Peek around user interface
US13/019,770 Abandoned US20110138320A1 (en) 2002-03-29 2011-02-02 Peek Around User Interface

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/112,394 Expired - Fee Related US7904826B2 (en) 2002-03-29 2002-03-29 Peek around user interface

Country Status (1)

Country Link
US (2) US7904826B2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD731524S1 (en) * 2011-10-27 2015-06-09 Htc Corporation Display screen with an animated graphical user interface
USD735236S1 (en) * 2012-12-26 2015-07-28 Successfactors, Inc. Portion of a display panel with an animated computer icon
USD736244S1 (en) * 2012-10-05 2015-08-11 Samsung Electronics Co., Ltd. Electronic device with a transitional graphical user interface
USD736809S1 (en) * 2013-02-23 2015-08-18 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737298S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737294S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737297S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737295S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737309S1 (en) * 2011-11-17 2015-08-25 Jtekt Corporation Control board device with graphical user interface
USD737838S1 (en) * 2013-02-23 2015-09-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737848S1 (en) * 2011-10-27 2015-09-01 Htc Corporation Display screen with an animated graphical user interface
USD737836S1 (en) * 2013-02-23 2015-09-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD738394S1 (en) * 2013-06-09 2015-09-08 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD765699S1 (en) 2015-06-06 2016-09-06 Apple Inc. Display screen or portion thereof with graphical user interface
USD771656S1 (en) * 2010-01-27 2016-11-15 Apple Inc. Display screen or portion thereof with graphical user interface
USD776144S1 (en) * 2014-08-20 2017-01-10 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD791800S1 (en) * 2012-04-27 2017-07-11 Yahoo! Inc. Display screen with a graphical user interface displaying a content wheel
USD819680S1 (en) * 2012-12-18 2018-06-05 2236008 Ontario Inc. Display screen or portion thereof with a graphical user interface
USD836648S1 (en) 2014-09-03 2018-12-25 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD864236S1 (en) 2013-06-10 2019-10-22 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD877175S1 (en) 2018-06-04 2020-03-03 Apple Inc. Electronic device with graphical user interface
USD882621S1 (en) 2014-05-30 2020-04-28 Apple Inc. Display screen or portion thereof with graphical user interface
USD900833S1 (en) 2017-09-11 2020-11-03 Apple Inc. Electronic device with animated graphical user interface
USD914050S1 (en) 2017-06-04 2021-03-23 Apple Inc. Display screen or portion thereof with graphical user interface
USD942987S1 (en) 2013-12-18 2022-02-08 Apple Inc. Display screen or portion thereof with graphical user interface
USD999237S1 (en) 2018-10-29 2023-09-19 Apple Inc. Electronic device with graphical user interface

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7904826B2 (en) * 2002-03-29 2011-03-08 Microsoft Corporation Peek around user interface
US8091030B1 (en) * 2006-12-14 2012-01-03 Disney Enterprises, Inc. Method and apparatus of graphical object selection in a web browser
US9092053B2 (en) * 2008-06-17 2015-07-28 Apple Inc. Systems and methods for adjusting a display based on the user's position
US20120284668A1 (en) * 2011-05-06 2012-11-08 Htc Corporation Systems and methods for interface management
US9843665B2 (en) 2011-05-27 2017-12-12 Microsoft Technology Licensing, Llc Display of immersive and desktop shells
US10417018B2 (en) * 2011-05-27 2019-09-17 Microsoft Technology Licensing, Llc Navigation of immersive and desktop shells
US9645733B2 (en) 2011-12-06 2017-05-09 Google Inc. Mechanism for switching between document viewing windows
JP6404120B2 (en) * 2011-12-27 2018-10-10 インテル コーポレイション Full 3D interaction on mobile devices
US9696879B2 (en) 2012-09-07 2017-07-04 Google Inc. Tab scrubbing using navigation gestures
AP00554S1 (en) * 2013-02-23 2014-04-11 Samsung Electronics Co Ltd Display screen or portion thereof with animated graphical user interface
CH711334A2 (en) * 2015-07-15 2017-01-31 Cosson Patrick A method and apparatus for helping to understand an auditory sensory message by transforming it into a visual message.
US20170052683A1 (en) * 2015-08-20 2017-02-23 International Business Machines Corporation Altering a display perspective to avoid scrolling
US10503456B2 (en) * 2017-05-05 2019-12-10 Nvidia Corporation Method and apparatus for rendering perspective-correct images for a tilted multi-display environment

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4667236A (en) * 1985-04-26 1987-05-19 Digital Services Corporation Television perspective effects system
US5119189A (en) * 1989-10-25 1992-06-02 Hitachi, Ltd. Stereoscopic imaging system
US5500671A (en) * 1994-10-25 1996-03-19 At&T Corp. Video conference system and method of providing parallax correction and a sense of presence
US5574836A (en) * 1996-01-22 1996-11-12 Broemmelsiek; Raymond M. Interactive display apparatus and method with viewer position compensation
US5808792A (en) * 1995-02-09 1998-09-15 Sharp Kabushiki Kaisha Autostereoscopic display and method of controlling an autostereoscopic display
US5949430A (en) * 1997-05-20 1999-09-07 Microsoft Corporation Peripheral lenses for simulating peripheral vision on a display device
US6084594A (en) * 1997-06-24 2000-07-04 Fujitsu Limited Image presentation apparatus
US6233004B1 (en) * 1994-04-19 2001-05-15 Canon Kabushiki Kaisha Image processing method and apparatus
US6285368B1 (en) * 1997-02-10 2001-09-04 Canon Kabushiki Kaisha Image display system and image display apparatus and information processing apparatus in the system
US6292593B1 (en) * 1996-09-18 2001-09-18 Sharp Kabushiki Kaisha Image synthesizing device and method
US6304286B1 (en) * 1995-06-09 2001-10-16 Pioneer Electronic Corporation Stereoscopic display apparatus for generating images in accordance with an observer's viewing position
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US20020018136A1 (en) * 1994-04-11 2002-02-14 Toshio Kaji Image processing apparatus
US20020047835A1 (en) * 2000-09-11 2002-04-25 Tomoaki Kawai Image display apparatus and method of displaying image data
US20020075384A1 (en) * 1997-11-21 2002-06-20 Dynamic Digital Depth Research Pty. Ltd. Eye tracking apparatus
US20020109680A1 (en) * 2000-02-14 2002-08-15 Julian Orbanes Method for viewing information in virtual space
US20020122014A1 (en) * 2001-03-02 2002-09-05 Rajasingham Arjuna Indraeswaran Intelligent eye
US20020135738A1 (en) * 2001-01-22 2002-09-26 Eastman Kodak Company Image display system with body position compensation
US20030012425A1 (en) * 1998-11-12 2003-01-16 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US20030053206A1 (en) * 2001-04-10 2003-03-20 Takayoshi Togino Display apparatus
US20030184576A1 (en) * 2002-03-29 2003-10-02 Vronay David P. Peek around user interface
US6938218B1 (en) * 2000-04-28 2005-08-30 James Nolen Method and apparatus for three dimensional internet and computer file interface

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69434685T2 (en) * 1993-08-04 2006-09-14 Canon K.K. Image processing method and apparatus
US5513276A (en) * 1994-06-02 1996-04-30 The Board Of Regents Of The University Of Oklahoma Apparatus and method for three-dimensional perspective imaging of objects
JP3478606B2 (en) * 1994-10-12 2003-12-15 キヤノン株式会社 Stereoscopic image display method and apparatus
US5838317A (en) * 1995-06-30 1998-11-17 Microsoft Corporation Method and apparatus for arranging displayed graphical representations on a computer interface
US6377230B1 (en) * 1995-10-05 2002-04-23 Semiconductor Energy Laboratory Co., Ltd. Three dimensional display unit and display method
US6118474A (en) * 1996-05-10 2000-09-12 The Trustees Of Columbia University In The City Of New York Omnidirectional imaging apparatus
US6198484B1 (en) * 1996-06-27 2001-03-06 Kabushiki Kaisha Toshiba Stereoscopic display system
JPH10198822A (en) * 1997-01-10 1998-07-31 Sharp Corp Image compositing device
JPH10334275A (en) * 1997-05-29 1998-12-18 Canon Inc Method and system for virtual reality and storage medium
US6195104B1 (en) * 1997-12-23 2001-02-27 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6064354A (en) * 1998-07-01 2000-05-16 Deluca; Michael Joseph Stereoscopic user interface method and apparatus
US6411292B1 (en) * 1999-03-31 2002-06-25 International Business Machines Corporation Display of pointing indicator within two-dimensional window display in three dimensions on a computer screen
US6803928B2 (en) * 2000-06-06 2004-10-12 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Extended virtual table: an optical extension for table-like projection systems
US6531999B1 (en) * 2000-07-13 2003-03-11 Koninklijke Philips Electronics N.V. Pointing direction calibration in video conferencing and other camera-based system applications
US6608615B1 (en) * 2000-09-19 2003-08-19 Intel Corporation Passive gaze-driven browsing
US20030080937A1 (en) * 2001-10-30 2003-05-01 Light John J. Displaying a virtual three-dimensional (3D) scene

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4667236A (en) * 1985-04-26 1987-05-19 Digital Services Corporation Television perspective effects system
US5119189A (en) * 1989-10-25 1992-06-02 Hitachi, Ltd. Stereoscopic imaging system
US20020018136A1 (en) * 1994-04-11 2002-02-14 Toshio Kaji Image processing apparatus
US6233004B1 (en) * 1994-04-19 2001-05-15 Canon Kabushiki Kaisha Image processing method and apparatus
US5500671A (en) * 1994-10-25 1996-03-19 At&T Corp. Video conference system and method of providing parallax correction and a sense of presence
US6327381B1 (en) * 1994-12-29 2001-12-04 Worldscape, Llc Image transformation and synthesis methods
US5808792A (en) * 1995-02-09 1998-09-15 Sharp Kabushiki Kaisha Autostereoscopic display and method of controlling an autostereoscopic display
US6304286B1 (en) * 1995-06-09 2001-10-16 Pioneer Electronic Corporation Stereoscopic display apparatus for generating images in accordance with an observer's viewing position
US5574836A (en) * 1996-01-22 1996-11-12 Broemmelsiek; Raymond M. Interactive display apparatus and method with viewer position compensation
US6292593B1 (en) * 1996-09-18 2001-09-18 Sharp Kabushiki Kaisha Image synthesizing device and method
US6285368B1 (en) * 1997-02-10 2001-09-04 Canon Kabushiki Kaisha Image display system and image display apparatus and information processing apparatus in the system
US5949430A (en) * 1997-05-20 1999-09-07 Microsoft Corporation Peripheral lenses for simulating peripheral vision on a display device
US6084594A (en) * 1997-06-24 2000-07-04 Fujitsu Limited Image presentation apparatus
US20020075384A1 (en) * 1997-11-21 2002-06-20 Dynamic Digital Depth Research Pty. Ltd. Eye tracking apparatus
US20030012425A1 (en) * 1998-11-12 2003-01-16 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US20020109680A1 (en) * 2000-02-14 2002-08-15 Julian Orbanes Method for viewing information in virtual space
US6938218B1 (en) * 2000-04-28 2005-08-30 James Nolen Method and apparatus for three dimensional internet and computer file interface
US20020047835A1 (en) * 2000-09-11 2002-04-25 Tomoaki Kawai Image display apparatus and method of displaying image data
US20020135738A1 (en) * 2001-01-22 2002-09-26 Eastman Kodak Company Image display system with body position compensation
US20020122014A1 (en) * 2001-03-02 2002-09-05 Rajasingham Arjuna Indraeswaran Intelligent eye
US20030053206A1 (en) * 2001-04-10 2003-03-20 Takayoshi Togino Display apparatus
US20030184576A1 (en) * 2002-03-29 2003-10-02 Vronay David P. Peek around user interface
US7904826B2 (en) * 2002-03-29 2011-03-08 Microsoft Corporation Peek around user interface

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD771656S1 (en) * 2010-01-27 2016-11-15 Apple Inc. Display screen or portion thereof with graphical user interface
USD737848S1 (en) * 2011-10-27 2015-09-01 Htc Corporation Display screen with an animated graphical user interface
USD731524S1 (en) * 2011-10-27 2015-06-09 Htc Corporation Display screen with an animated graphical user interface
USD741351S1 (en) * 2011-11-17 2015-10-20 Jtekt Corporation Control board device with graphical user interface
USD737309S1 (en) * 2011-11-17 2015-08-25 Jtekt Corporation Control board device with graphical user interface
USD791800S1 (en) * 2012-04-27 2017-07-11 Yahoo! Inc. Display screen with a graphical user interface displaying a content wheel
USD736244S1 (en) * 2012-10-05 2015-08-11 Samsung Electronics Co., Ltd. Electronic device with a transitional graphical user interface
USD819680S1 (en) * 2012-12-18 2018-06-05 2236008 Ontario Inc. Display screen or portion thereof with a graphical user interface
USD735236S1 (en) * 2012-12-26 2015-07-28 Successfactors, Inc. Portion of a display panel with an animated computer icon
USD736809S1 (en) * 2013-02-23 2015-08-18 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737298S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737836S1 (en) * 2013-02-23 2015-09-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737297S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737838S1 (en) * 2013-02-23 2015-09-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737295S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD737294S1 (en) * 2013-02-23 2015-08-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD775147S1 (en) 2013-06-09 2016-12-27 Apple Inc. Display screen or portion thereof with graphical user interface
USD860233S1 (en) 2013-06-09 2019-09-17 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD956061S1 (en) 2013-06-09 2022-06-28 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD738394S1 (en) * 2013-06-09 2015-09-08 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD808401S1 (en) 2013-06-09 2018-01-23 Apple Inc. Display screen or portion thereof with graphical user interface
USD789969S1 (en) 2013-06-09 2017-06-20 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD864236S1 (en) 2013-06-10 2019-10-22 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD1012103S1 (en) 2013-12-18 2024-01-23 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD942987S1 (en) 2013-12-18 2022-02-08 Apple Inc. Display screen or portion thereof with graphical user interface
USD892155S1 (en) 2014-05-30 2020-08-04 Apple Inc. Display screen or portion thereof with graphical user interface
USD882621S1 (en) 2014-05-30 2020-04-28 Apple Inc. Display screen or portion thereof with graphical user interface
USD776144S1 (en) * 2014-08-20 2017-01-10 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD836648S1 (en) 2014-09-03 2018-12-25 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD789396S1 (en) 2015-06-06 2017-06-13 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD783668S1 (en) 2015-06-06 2017-04-11 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD877769S1 (en) 2015-06-06 2020-03-10 Apple Inc. Display screen or portion thereof with graphical user interface
USD863342S1 (en) 2015-06-06 2019-10-15 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD888756S1 (en) 2015-06-06 2020-06-30 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD789960S1 (en) 2015-06-06 2017-06-20 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD765699S1 (en) 2015-06-06 2016-09-06 Apple Inc. Display screen or portion thereof with graphical user interface
USD784398S1 (en) 2015-06-06 2017-04-18 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD914050S1 (en) 2017-06-04 2021-03-23 Apple Inc. Display screen or portion thereof with graphical user interface
USD956088S1 (en) 2017-09-11 2022-06-28 Apple Inc. Electronic device with animated graphical user interface
USD900833S1 (en) 2017-09-11 2020-11-03 Apple Inc. Electronic device with animated graphical user interface
USD877175S1 (en) 2018-06-04 2020-03-03 Apple Inc. Electronic device with graphical user interface
USD962269S1 (en) 2018-06-04 2022-08-30 Apple Inc. Electronic device with animated graphical user interface
USD999237S1 (en) 2018-10-29 2023-09-19 Apple Inc. Electronic device with graphical user interface

Also Published As

Publication number Publication date
US20030184576A1 (en) 2003-10-02
US7904826B2 (en) 2011-03-08

Similar Documents

Publication Publication Date Title
US7904826B2 (en) Peek around user interface
US8416266B2 (en) Interacting with detail-in-context presentations
US7480873B2 (en) Method and apparatus for manipulating two-dimensional windows within a three-dimensional display model
US6016145A (en) Method and system for transforming the geometrical shape of a display window for a computer system
US7043701B2 (en) Opacity desktop with depth perception
EP1564632B1 (en) Systems and methods that utilize a dynamic digital zooming interface in connection with digital inking
US6943811B2 (en) Apparatus and method of managing data objects
US8711183B2 (en) Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
EP0694829B1 (en) A method and apparatus for visualization of database search results
US20060082901A1 (en) Interacting with detail-in-context presentations
US9324188B1 (en) Manipulation of 3D medical objects
US7486302B2 (en) Fisheye lens graphical user interfaces
US20050204306A1 (en) Enhancements for manipulating two-dimensional windows within a three-dimensional display model
US20210294463A1 (en) Techniques to Modify Content and View Content on Mobile Devices
US20070097150A1 (en) Viewport panning feedback system
US20040100479A1 (en) Portable information terminal, display control device, display control method, and computer readable program therefor
US20070198941A1 (en) Graphical user interface with zoom for detail-in-context presentations
JP2009537903A (en) User interface system and method for selectively displaying a portion of a display screen
JP2002140147A (en) Graphical user interface
WO2002091153A2 (en) Graphical user interface for detail-in-context presentations
EP2401720A1 (en) Displaying bar charts with a fish-eye distortion effect
US6018333A (en) Method and apparatus for selection and manipulation of an overlapping graphical element on a display
WO2018198703A1 (en) Display device
JP4907156B2 (en) Three-dimensional pointing method, three-dimensional pointing device, and three-dimensional pointing program
CN115485734A (en) Interface method and device for drawing three-dimensional sketch

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION