WO2000046680A1 - Novel method and apparatus for controlling video programming - Google Patents


Info

Publication number
WO2000046680A1
Authority
WO
WIPO (PCT)
Prior art keywords
control device
remote control
images
motion
image
Application number
PCT/US2000/002870
Other languages
French (fr)
Inventor
Yakov Kamen
Leon Shirman
Original Assignee
Isurftv
Priority claimed from US09/344,442 external-priority patent/US6342884B1/en
Application filed by Isurftv filed Critical Isurftv
Publication of WO2000046680A1 publication Critical patent/WO2000046680A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42222Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42653Internal components of the client ; Characteristics thereof for processing graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics


Abstract

A display screen (201) provides an image of a set of surfaces, e.g. a polyhedron (204), each one of the surfaces depicting a menu option. A remote control device provides an input signal to the television (200), which responds by manipulating the orientation of the surfaces, exposing the various menu options available to the user. The user can then click on the desired face of the polyhedron, corresponding to the desired option. In one embodiment, the remote control device comprises sensing means for detecting the motion and/or position of the remote control device. The polyhedron moves in a manner that tracks the motion of the remote control device.

Description

NOVEL METHOD AND APPARATUS FOR CONTROLLING VIDEO PROGRAMMING
Cross-Reference to Related Applications
This patent claims priority based on U.S. Patent Application 09/344,442, filed June 25, 1999; 09/378,184, filed August 20, 1999; 09/378,270, filed August 20, 1999; and 60/118,505, filed February 3, 2000, each incorporated herein by reference in its entirety.

Background of the Invention
This invention pertains to remote control devices for controlling a television. There are numerous types of remote control devices used for controlling a television. One type of remote control device comprises a numeric keypad for punching in the number of a desired television channel, as well as buttons for selecting other options. Other types of remote control devices move a cursor on a screen to select a desired option. These techniques do not usually permit a viewer to preview a channel option before that option is selected. Another type of option selection scheme is to provide a set of small pictures on a television screen, and permit a user to "click" on one of the pictures to select an option corresponding to that picture. Such small pictures are sometimes called "thumbnails." Unfortunately, one can only put so many thumbnail pictures on a screen due to the limited resolution of the television screen. It is an object of our invention to provide an improved method and apparatus for selecting options for controlling an image display device, e.g. a television, computer screen, video editing device, or other type of device comprising an image display.

Summary
A method in accordance with one aspect of our invention comprises the step of displaying multiple video streams on a display device such as a computer monitor or a television. In one embodiment, the screen of the display device contains a primary portion and a secondary portion. A first one of the multiple video streams (hereafter the "main video stream") is displayed on the primary portion of the screen. The secondary portion of the screen displays an image containing a plurality of geometric surfaces. In one embodiment, the plurality of geometric surfaces are arranged as a polyhedron. At least one of the multiple video streams is mapped onto at least one of the faces of the polyhedron. Typically, several of the multiple video streams are mapped onto associated ones of the faces of the polyhedron. Alternatively, other faces of the polyhedron display images such as icons corresponding to an option that an operator can exercise, e.g. turning the volume of a television up or down, changing a channel, or performing various video editing functions.
In accordance with another aspect of our invention, a hand-held remote control device permits a user to manipulate and/or select the video images mapped onto the geometric surfaces. In one embodiment, the position in which the remote control device is held is associated with the position of the geometric surfaces in the world coordinate system. (As explained below, the term "world coordinate system" pertains to the orientation of an image displayed on a display screen.) Thus, by rotating the remote control device, one can rotate those geometric surfaces.
In one embodiment, the main video stream is mapped onto a flat geometric surface. By rotating the remote control device, one can also rotate the flat geometric surface upon which the main video stream is mapped.
In accordance with another embodiment of the invention, the hand-held remote control device has a scrolling wheel. By rotating the wheel, one can rotate one or more of the geometric surfaces upon which images are mapped. In one embodiment in which the plurality of geometric surfaces form a polyhedron, by rotating the wheel, one can rotate the polyhedron. By rotating the wheel, one can also rotate the surface upon which the main video stream is mapped.
In accordance with another embodiment of the invention, the hand-held remote control device comprises a track ball. By rotating the track ball, one can rotate one or more of the geometric surfaces upon which images are mapped. In one embodiment in which the plurality of geometric surfaces form a polyhedron, by moving the track ball, one can rotate the polyhedron. By moving the track ball, one can also rotate the surface upon which the main video stream is mapped.

As mentioned above, in one embodiment, video streams are mapped onto the various faces of a polyhedron, and rotating the hand-held remote control device results in rotation of the polyhedron. (In one embodiment, the polyhedron can rotate about only one axis. In another embodiment, it can rotate about more than one axis.) A control element on the remote control device, e.g. a button or switch, can be used to select which image on a polyhedron face is to be shown on the primary portion of the display device as the main video stream.
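The face-selection step just described can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the stream names and the idea of addressing faces by an integer index are assumptions made for the example.

```python
# Hypothetical streams bound to the polyhedron's faces. The index of the face
# currently turned toward the viewer determines what the select button promotes.
face_streams = ["news-channel", "sports-channel", "movie-channel", "volume-icon"]

def select_front_face(front_face_index):
    """Return the stream to promote to the primary portion of the screen
    when the user actuates the select control."""
    return face_streams[front_face_index]

# User presses the select button while face 1 is facing them:
main_video_stream = select_front_face(1)
```

The design point is simply that the remote's select control names a face, and the face names a stream; the pipeline then remaps that stream onto the primary portion of the screen.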
In one embodiment, the hand-held remote control device is held by a user who can rotate the remote control device, e.g. about any desired axis. Means are provided within the remote control device for sensing motion and/or position of the remote control device, and communicating that motion and/or position to a receiver within the television. A receiving circuit within the television causes the image of the polyhedron to rotate or move in a manner that mirrors the motion of the remote controller. When a face of the polyhedron depicting an image representing a desired option is facing the user, he can actuate a button or other control device on the remote controller to select that option.
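The motion-mirroring behavior described above can be sketched in code. This is a minimal illustration under assumptions not in the patent: a hypothetical sensor callback reports pitch and yaw deltas in radians, and these are accumulated into a rotation matrix that a graphics pipeline would apply to the polyhedron's vertices.

```python
import math

def rot_x(a):
    """3x3 rotation matrix about the x axis (angle in radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """3x3 rotation matrix about the y axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def mat_mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

class PolyhedronOrientation:
    """Accumulates the polyhedron's rotation so it mirrors the remote's motion."""
    def __init__(self):
        self.matrix = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity: no rotation yet

    def on_remote_motion(self, d_pitch, d_yaw):
        """Hypothetical callback: angular deltas reported by the remote's sensor."""
        delta = mat_mul(rot_y(d_yaw), rot_x(d_pitch))
        self.matrix = mat_mul(delta, self.matrix)

    def transform(self, vertex):
        """Rotate one model-coordinate vertex into its displayed orientation."""
        return mat_vec(self.matrix, vertex)
```

Each sensor report turns the displayed polyhedron by the same angles the user turned the handset, which is the mirroring behavior the receiving circuit is described as producing.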
In another embodiment, instead of displaying a polyhedron, the menu options displayed on the video screen can be displayed in another form. However, different menu options can be displayed and/or selected in response to the motion and/or position of the remote control device.
Brief Description of the Drawings
Figs. 1A to 1E illustrate the operation of a 3D graphics pipeline.
Figs. 2A and 2B illustrate manipulation of a 2D image.
Fig. 3 is a simplified block diagram of a personal computer (PC) coupled to a graphics controller with a 3D graphics pipeline.
Fig. 4 illustrates a television displaying an image of a polyhedron constructed in accordance with our invention.
Fig. 5 illustrates a remote control device for controlling the television comprising position or motion sensors.
Fig. 6 illustrates a remote control device for controlling the television comprising a track ball.
Fig. 7 illustrates a remote control device for controlling the television comprising a rotating wheel.
Fig. 8 illustrates a television displaying a band of images.
Detailed Description
As mentioned above, a method in accordance with our invention involves displaying an image of a polyhedron on a television or other display device. Each face of the polyhedron depicts an image representing an option that can be taken by someone operating the television. The polyhedron and the images on the faces of the polyhedron are generated using a 3D graphics pipeline in a novel manner. In order to explain the manner in which the polyhedron and images are generated, we will first explain how a 3D graphics pipeline is normally used. We will then describe its use during a method in accordance with the invention. We will then describe remote control devices that can be used to manipulate the orientation of the polyhedron.

3D Graphics Pipelines
The 3D graphics pipeline referred to in this patent can be implemented by a combination of hardware elements, known as accelerators, and software, some of which is sometimes referred to as drivers. The partitioning between hardware and software may vary, depending upon the CPU used and the graphics card in the system, but the overall system performs the method steps described below. Portions of the pipeline tasks can be performed by software, which is less expensive than hardware, but in general slower than hardware solutions at the present time. The hardware and software that perform the steps described below are referred to simply as a pipeline, without regard to the specific partitioning.
The following is a simplified, general description of 3D graphics pipelines. It is not intended to describe any specific product (e.g. products mentioned later in this patent). Rather, the following description is merely a general explanation of 3D graphics pipelines to assist the reader's understanding.
Currently, graphics objects created using a 3D graphics pipeline can be described as a set of geometric surfaces. One way of constructing a geometric surface in a graphics pipeline is to create a "mesh" of "primitives." A "primitive" is a small geometric surface that can be defined by a set of vertices. For example, the primitive can be a polygon (e.g. a triangle or quadrilateral) defined within the pipeline in terms of the locations (in x, y and z coordinate space) of its corners or vertices. A set of several primitives is used to define a larger 3D surface.
Instead of using primitives, such as polygons, some graphics pipelines can process geometric surface areas defined in other ways, e.g. by mathematical equations. This technique for defining geometric surface areas is called "implicit." As explained below, both techniques for defining such surface areas can be used.

For purposes of clarity of explanation, we will first describe a graphics pipeline that processes geometric surface areas using triangular primitives. Other types of graphics pipelines will be discussed later on.
In this first example, a 3D graphics pipeline constructs a 3D image of an object from a 2D pixel array (typically called a "texture map"). Fig. 1A illustrates a 2D image 2 of a set of "textures." (As will be explained below, this texture map is used to create the image of an object, in this case, a house. Image 2 includes a portion 2a, which has the appearance of bricks, portion 2b, which has the appearance of roof shingles, portion 2c, which has the appearance of a door, and portion 2d, which has the appearance of a window.) 2D image 2 is stored in a digital memory in the form of an array of pixels. Each location in the memory stores a pixel, which is one or more words of data indicating the color, color saturation and brightness corresponding to that pixel. The location of the pixels within the array is typically referred to as u, v coordinates (not to be confused with the Y, U and V signal names used to describe certain video signals). (The u, v coordinates are similar to x, y coordinates of the Cartesian coordinate system. In Fig. 1A, the pixel array is an n by m array, where n and m are integers.)
As mentioned above, Fig. 1A represents a pixel array. Physically, the array comprises data loaded into a memory.
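As a concrete sketch of such a pixel array (the class and field names are illustrative, not from the patent), a texture map can be held as an n-by-m array of pixel words addressed by integer u, v coordinates:

```python
class TextureMap:
    """An n-by-m pixel array; each pixel is one (r, g, b) word, addressed by (u, v)."""
    def __init__(self, n, m, fill=(0, 0, 0)):
        self.n, self.m = n, m
        # Row u, column v: the memory layout the u, v coordinates index into.
        self.pixels = [[fill for _ in range(m)] for _ in range(n)]

    def set_pixel(self, u, v, rgb):
        self.pixels[u][v] = rgb

    def pixel(self, u, v):
        return self.pixels[u][v]

# A tiny 4x4 brick-colored region, loosely like portion 2a of image 2:
brick = TextureMap(4, 4, fill=(180, 60, 40))
brick.set_pixel(0, 0, (220, 220, 210))  # one mortar-colored pixel
```

Here each pixel word carries color components directly (RGB); as the text notes, YUV or other encodings could equally be stored in the same array.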
The next step in the process is to provide or prepare a geometric surface. In this example, the geometric surface is in the form of a mesh 4 of primitives 5 in three-dimensional space (Fig. 1B). In the case of Fig. 1B, the primitives are triangles, but other types of polygons can be used. The mesh of primitives represents a three-dimensional shape of an object O in 3D space (in the case of Fig. 1B, the shape of a house). The position of each vertex of each triangle within mesh 4 is stored in a memory in the form of x, y and z Cartesian coordinates, relative to the object. These coordinates are sometimes referred to as model coordinates ("MC"). The process of preparing such a mesh is well-known, and described in standard graphics libraries, such as Real 3D, published by Real 3D, a Lockheed Martin Corporation, in 1996, and Direct 3D, published by New Riders Publishing in 1997.
The mesh of Fig. 1B is not displayed as such. Rather, the mesh of Fig. 1B is a representation of what is stored in a digital memory. Specifically, the memory stores the locations, in terms of x, y and z coordinates, of each vertex within mesh 4.
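The stored form of such a mesh can be sketched as follows. This is a hedged illustration of the general idea, not the patent's data layout: each vertex carries x, y, z model coordinates, and each triangular primitive is a triple of indices into the vertex list.

```python
# Model coordinates ("MC"): each vertex is an (x, y, z) triple relative to the object.
vertices = [
    (0.0, 0.0, 0.0),  # vertex 0
    (1.0, 0.0, 0.0),  # vertex 1
    (0.0, 1.0, 0.0),  # vertex 2
    (0.0, 0.0, 1.0),  # vertex 3
]

# Each triangular primitive is stored as three indices into the vertex list.
triangles = [
    (0, 1, 2),
    (0, 1, 3),
    (0, 2, 3),
    (1, 2, 3),
]  # together these four triangles mesh a tetrahedron

def triangle_corners(i):
    """Return the three model-coordinate corners of primitive i."""
    a, b, c = triangles[i]
    return vertices[a], vertices[b], vertices[c]
```

As in the text, nothing here is an image yet; it is only the vertex locations a memory would hold for the pipeline to process.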
The next step is to map or "bind" the two-dimensional texture map of Fig. 1A onto mesh 4 of Fig. 1B. This is accomplished by mapping each triangle vertex to a location in the texture map. In effect, a list of data points is prepared that associates each vertex of mesh 4 to the u, v coordinates of a particular point (pixel) in the texture map of Fig. 1A. (The locations in the texture map to which the vertices are bound are sometimes referred to as "control points.")
16 This portion ofthe process is roughly analogous to an upholsterer choosing a
17 piece of fabi ic, and binding it with a few nails to the corner of a couch being upholstered
18 (Ihc nails a e like control poinls) The upholsterer subsequently asks his apprentice to
1 finish attaching the fabr ic to the couch In this case, the 3D graphics pipeline finishes the 0 task instead of an apprentice
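The binding list described above can be sketched as a table from vertex index to texture-space control point. The particular vertex indices and (u, v) values are illustrative assumptions:

```python
# "Binding" a texture to a mesh: a table associating each vertex of the
# mesh with the (u, v) coordinates of its control point in the texture map.
binding = {
    0: (0.0, 0.0),   # vertex 0 -> control point at (u=0.0, v=0.0)
    1: (1.0, 0.0),
    2: (0.0, 1.0),
    3: (1.0, 1.0),
}

def control_point(vertex_index):
    """Return the texture-space control point bound to a mesh vertex."""
    return binding[vertex_index]
```

The pipeline interpolates texture pixels between these control points across each triangle, just as the apprentice stretches the fabric between the nails.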
Figs. 1A and 1B describe the process by which one texture map (Fig. 1A) is mapped onto one mesh 4 representing one object O. A graphics pipeline can, and often does, map one or several texture maps onto the same or several different objects.

The next step in the process is to set up a "world coordinate model" of the various objects to be displayed. This requires establishing a position and directional orientation for each object to be displayed. For example, suppose that there are to be two objects to be viewed: a tetrahedron T and a cube C (Fig. 1C). During this step of the process the pipeline is instructed that cube C is to be facing in a certain direction, and is to be located partially in front of tetrahedron T relative to a certain frame of reference. Again, the structure of Fig. 1C is not displayed per se. Rather, the graphics pipeline sets up processing of the model coordinates in accordance with the parameters of the position and orientation of the object.

The next step is to select a frame of reference. For example, it might be decided that the "viewer" will want to observe the objects from a position corresponding to a corner of the world coordinate model (e.g. position P in Fig. 1D). Thus, a virtual viewpoint, viewing direction and aperture will be selected. The parameters associated with this "viewer" define the screen coordinate (SC) system. Further, it might be decided that the viewer will observe these objects with a light source located at a position L. The graphics pipeline will set up another processing pipe to process the world coordinate data into the screen coordinate data, which will cause a computer screen to display the image as it would be perceived by the observer at position P (e.g. the image of Fig. 1D). In other words, the computer screen will provide an image of tetrahedron T and cube C as they would be observed by a viewer if he were standing at position P, and a light source were present at location L. This image will be provided initially as a pixel array in a frame buffer and then displayed by the computer screen. The image in the frame buffer is refreshed, i.e. regenerated according to the specifications programmed into the pipeline, typically at about 50 to 120 times per second. There are many different methods for optimizing the pipeline, and minimizing the time spent processing the invisible parts of the objects, such as the backside of cube C facing away from the viewer. Such details are well known to those skilled in the art, and will not be discussed in detail here.
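The conversion of world coordinate data into screen coordinate data can be illustrated by the classical perspective projection of a single vertex. The sketch below assumes, for simplicity, a viewer at the origin looking along the +z axis with a focal-length parameter; these simplifications are ours and are not mandated by the specification:

```python
def world_to_screen(point, focal=1.0):
    """Project a world-coordinate vertex (x, y, z) onto the screen plane
    for a viewer at the origin looking along +z (a simplifying assumption).
    Returns screen coordinates (sx, sy)."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the viewer")
    return (focal * x / z, focal * y / z)

# A vertex twice as far away lands twice as close to the screen center,
# which is what makes the projected image look three-dimensional:
assert world_to_screen((2.0, 2.0, 2.0)) == (1.0, 1.0)
assert world_to_screen((2.0, 2.0, 4.0)) == (0.5, 0.5)
```

A real pipeline applies this kind of transform (plus lighting from position L and clipping) to every vertex each time the frame buffer is regenerated.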
During the above-described process of constructing the pixel array and providing it in the frame buffer, the pipeline a) fetches the portion of texture map 2 "tacked" to the vertices of mesh 4 (and therefore stretched over each triangle); b) determines how and where that portion of the texture map should appear, given the orientation of the triangles relative to the viewer and the location of the light source; and c) constructs the appropriate bit map pixel array for storage in the frame buffer. The contents of this frame buffer are then displayed as an image on a computer screen.
Thereafter, the 3D graphics accelerator permits one to manipulate the displayed objects in any desired manner. For example, if one wants to rotate the image of tetrahedron T by 45° (Fig. 1E), the 3D graphics accelerator facilitates this manipulation. This is accomplished by providing a new set of parameters in the world coordinate model for the graphics pipeline indicating the new position and orientation for tetrahedron T. After this occurs, the next time the graphics pipeline regenerates the image stored in the frame buffer, the regenerated image will reflect this rotation of tetrahedron T.

Similarly, suppose that it is desired to display what would appear to the viewer if he took ten steps forward from his location at position P. The next time the graphics pipeline regenerates the image, it will generate and store another pixel array in the frame buffer corresponding to what would appear to such a viewer, and this pixel array is provided as another image on the computer screen.

It is thus seen that the graphics pipeline is extremely useful in applications such as video games, where it is desired to simulate what would appear to a game player if he were wandering past a set of objects.
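The "new set of parameters" handed to the pipeline for a rotation amounts to recomputing each vertex's world coordinates. A minimal sketch for rotation about the z axis (the choice of axis is illustrative):

```python
import math

def rotate_about_z(vertex, degrees):
    """Return a vertex's new world coordinates after rotating the object
    about the z axis -- the kind of parameter change supplied to the
    pipeline before it regenerates the frame buffer."""
    x, y, z = vertex
    a = math.radians(degrees)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

# Rotating the point (1, 0, 0) by 45 degrees, as in the Fig. 1E example:
x, y, z = rotate_about_z((1.0, 0.0, 0.0), 45.0)
assert abs(x - math.sqrt(0.5)) < 1e-9 and abs(y - math.sqrt(0.5)) < 1e-9
```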
As mentioned above, some graphics pipelines create models of geometric surfaces using an implicit technique. These surfaces are often described as a function of the position coordinates, i.e. f(x, y, z), or can also contain some vertices. Control points and additional formulas associated with such surfaces are used to bind a digital pixel array (e.g. an array as shown in Fig. 1A) to the implicitly defined surface, and the process proceeds as described above. The major difference is that instead of defining surface areas in terms of primitives with vertices, the surface areas are defined in terms of mathematical equations.
Manipulation of 2D Images
As mentioned above, one embodiment of the invention is a remote controller which cooperates with video graphics circuitry that provides an image of a polyhedron, each face of the polyhedron displaying an image corresponding to a menu option. I will now explain how that image is provided. In particular, I will describe a method for manipulating a two-dimensional image.
A method for manipulating a two-dimensional image begins with the step of obtaining a two-dimensional digital image (e.g. image 10 in Fig. 2A). This step can be performed, e.g., by scanning an image such as a photograph or other picture using a conventional digital scanner. The digital image can also be obtained from a conventional digital camera. The image can also consist of a digital video image, e.g. out of a live or stored video stream, which is basically a fast succession of 2D images. However, any other source of 2D digital images can be used. The digital image is typically stored in a memory as an array of digital values. In one embodiment, the digital values are in a compressed form, e.g. using a compression technique such as MPEG-1 or MPEG-2 or other formats. In the case of compressed digital values, they must first be decompressed prior to processing. Also, scanned images or digitized images from any source such as cable TV, an antenna, cameras, etc. can be used.
As mentioned above, for the case of video images, dozens of frames per second comprising millions of pixels per second must be processed. We discovered that standard graphics pipelines can be used to process frames of data sufficiently fast to process video images.
Any type of memory can be used to store the digital 2D image, e.g. semiconductor memories (SRAMs, DRAMs or other semiconductor memories), a magnetic memory (e.g. a hard disk, a floppy disk, magnetic tape, or a magneto-optic disk), or another type of memory device (e.g. an optical disk). The pixels corresponding to the stored image can be stored in terms of RGB values (e.g. the strength of the red, green and blue components of the pixel color), YUV values or other values. (For YUV values, Y corresponds to the amplitude or brightness of the pixel value, U corresponds to the color and V corresponds to the saturation.) The pixel values can be encoded in other ways as well. Depending on the situation, a conversion may be required before further processing.

Next, a 3D graphics pipeline is set up. This is accomplished by providing instructions to the 3D graphics pipeline as to what is to be done with the data that is to be provided. Setting up graphics pipelines per se is well known in the art, e.g. as described in the Microsoft Direct 3D SDK (software developer kit) or Direct 3D.

Thereafter, a computer model of a planar geometric surface is generated. This computer model can comprise a set of primitives, e.g. polygons such as triangles. In another embodiment, the computer model can comprise an implicit description of a flat geometric surface. This implicit description is typically a mathematical function (e.g. a function of x, y and z) as described above.
For the case in which the planar geometric surface comprises a mesh of primitives, the number and shape of primitives and the type of primitives can vary. Fig. 2B illustrates a mesh 12 that can be used to practice a method in accordance with our invention. Mesh 12 is similar to mesh 4 described above. However, unlike mesh 4, all of the vertices of mesh 12 are coplanar (or substantially coplanar). In one embodiment mesh 12 comprises about 5000 triangles, which would be acceptable for processing a video image. Of course, other numbers of primitives could be used.
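A coplanar mesh like mesh 12 can be generated programmatically as a regular grid of vertices split into triangles. The grid resolution below is chosen only to reproduce the 5000-triangle count mentioned above; the construction itself is an illustrative sketch:

```python
def flat_mesh(rows, cols):
    """Build a coplanar triangle mesh like mesh 12 of Fig. 2B: a
    rows-by-cols grid of vertices at z = 0, each grid cell split
    into two triangles (given as vertex-index triples)."""
    vertices = [(float(c), float(r), 0.0)
                for r in range(rows) for c in range(cols)]
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))             # upper triangle
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower triangle
    return vertices, triangles

# A 51 x 51 grid yields 2 * 50 * 50 = 5000 triangles, the count the
# text mentions as acceptable for processing a video image:
v, t = flat_mesh(51, 51)
assert len(t) == 5000
assert all(z == 0.0 for (_, _, z) in v)  # all vertices coplanar
```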
After constructing the planar geometric surface (e.g. mesh 12), image 10 is mapped, or bound, onto the flat geometric surface. This is accomplished in the following way. For the case in which the flat geometric surface is a mesh such as mesh 12, each vertex of the flat geometric surface (e.g. the triangle vertices) is associated with an image pixel location (i.e. control point). Thus, each control point is associated with texture coordinates (u, v) corresponding to a pixel. A table of data listing each vertex and its associated u, v texture space coordinates is set up. This is called "binding." (See Kamen, IEEE Computer Society, IEEE Computer Graphics and Applications, Jan.-Feb. 1997, Vol. 17, No. 1.) For the case in which an implicit technique is used to define the flat geometric surface, control points within the implicitly defined surface are bound to pixel array coordinate space (u, v coordinates) in a manner analogous to the triangles discussed above.
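For a regular flat mesh, the binding table can be generated rather than listed by hand. Mapping each grid vertex proportionally onto the pixel array is one common convention; it is an assumption here, not something the text mandates:

```python
def bind_flat_mesh(rows, cols, tex_w, tex_h):
    """Associate each vertex of a rows-by-cols flat mesh with the (u, v)
    pixel coordinates of its control point in a tex_w-by-tex_h image,
    spreading the vertices evenly across the pixel array."""
    table = {}
    for r in range(rows):
        for c in range(cols):
            vertex_index = r * cols + c
            u = c * (tex_w - 1) // (cols - 1)
            v = r * (tex_h - 1) // (rows - 1)
            table[vertex_index] = (u, v)
    return table

table = bind_flat_mesh(3, 3, 640, 480)
assert table[0] == (0, 0)        # top-left vertex -> first pixel
assert table[8] == (639, 479)    # bottom-right vertex -> last pixel
```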
After image 10 is mapped onto mesh 12, the object can be manipulated by manipulating the world coordinates. The world coordinates describe where in the x, y, z space the textured plane is to appear, and what its orientation will be (i.e. what angle it should be held at with respect to the viewer). In addition, the screen coordinates for the object can be changed. As a result, when the 2D textured image is finally prepared, it can be prepared in such a manner that reflects the desired manipulation. For example, it can be rotated about any axis, magnified, shrunk, etc.
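Placing the textured plane in world coordinates can be sketched as a model-to-world mapping applied per vertex, here a uniform magnification followed by a translation (an axis-angle rotation would slot in the same way); the helper names are illustrative:

```python
def place_plane(scale, position):
    """Return a function mapping a model-coordinate vertex of the
    textured plane to world coordinates: uniform magnification
    (scale) followed by translation to `position`."""
    px, py, pz = position
    def to_world(v):
        x, y, z = v
        return (x * scale + px, y * scale + py, z * scale + pz)
    return to_world

# Magnify the plane 2x and push it back along z, away from the viewer:
to_world = place_plane(2.0, (0.0, 0.0, 5.0))
assert to_world((1.0, 1.0, 0.0)) == (2.0, 2.0, 5.0)
```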
After establishing the world coordinate model and screen coordinate model, the pipeline prepares an array of pixels in the output frame buffer (OFB), including pixels showing the manipulated textured mesh 12. The array of pixels in the OFB is displayed on a CRT or other type of screen.
One can manipulate the video image by, for example, changing the world coordinate parameters, e.g. telling the pipeline to tilt the video image about any axis (including an axis perpendicular to the screen or in the plane of the screen). Thus, when the pipeline regenerates the pixel array in the OFB, the regenerated video image will appear tilted about the selected axis. Since the pipeline will regenerate the image at a preprogrammed rate according to the system used, live video will appear as live video. That is because every time a new pixel array is generated, the texture map, which contains the incoming video frame buffer, is reread and put through the pipeline. Since the texture mapping process also contains features for pixel interpolation, an automatic resolution adaptation occurs.
One can bend or warp the image by moving the vertices about which the image is mapped. Thus, one can alter the flat geometric plane of Fig. 2B to thereby warp the image. When the pipeline regenerates the pixel array in the frame buffer, the image will appear warped.

One can move the vertices so that mesh 12 becomes a cylinder. When the pipeline regenerates the pixel array in the frame buffer, the image will appear wrapped around a cylinder. (Of course, mesh 12 can be altered into other shapes, and the image would be wrapped around the other shape.) These modifications could be done at a speed that would create the impression in the viewer that the image was being wrapped or warped gradually.
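Wrapping the flat mesh into a cylinder is just such a vertex move: each vertex's x coordinate is reinterpreted as an angle around the cylinder axis. The parameterization below is one illustrative choice, not the only one:

```python
import math

def wrap_into_cylinder(vertices, radius=1.0):
    """Move the vertices of a flat mesh so that it becomes a cylinder
    of the given radius around the y axis. When the pipeline next
    regenerates the frame buffer, the bound image then appears
    wrapped around the cylinder."""
    wrapped = []
    for (x, y, z) in vertices:
        angle = x / radius  # arc length along the flat plane -> angle
        wrapped.append((radius * math.sin(angle), y, radius * math.cos(angle)))
    return wrapped

flat = [(0.0, 0.0, 0.0), (math.pi / 2, 0.0, 0.0)]
cyl = wrap_into_cylinder(flat)
# Every wrapped vertex lies at distance `radius` from the cylinder axis:
assert all(abs(math.hypot(x, z) - 1.0) < 1e-9 for (x, y, z) in cyl)
```

Interpolating the vertex positions between the flat and wrapped states over several frames produces the gradual-wrapping impression described above.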
One could magnify or shrink images by moving vertices away from or closer to each other, or moving the image closer or further from the viewer in the world coordinate system, or by re-parameterizing the model coordinate to world coordinate conversion.
Hardware and Software for Manipulating a Two-Dimensional Image
One embodiment of our invention can be practiced using a PC having the following:

1. A CPU such as a Celeron or Pentium, e.g. as manufactured by Intel, or a K6 processor, e.g. as manufactured by Advanced Micro Devices.
2. 32 MB of memory or greater.
3. A 3D HW adapter. This is a type of graphics card currently available on the market. The 3D HW adapter should have 4 MB of memory (preferably 8 MB) and an advanced graphics port (AGP) interface. (An AGP interface is a type of bus standard that is well known in the art.) Alternatively, a peripheral connection interface ("PCI") can be used in lieu of an AGP. The PCI is a type of bus standard that is well known in the art. Examples of appropriate 3D HW adapters include the Riva TNT2, the ATI Rage 128, the Matrox G400, the Trident Blade 3D and the S3 Savage.
4. The operating system can be Windows 95, Windows 98, Win2000, or any other operating system that supports Direct 3D. The Windows operating system includes a standardized platform called DirectX for Windows.
In one embodiment, a user sets up the flat geometric surface (for example, a triangle mesh) in the Direct 3D Windows environment. The set of instructions is then provided to the graphics pipeline, which finishes the rendering process. However, in another embodiment, the PC comprises a bypass mechanism that permits one to access the hardware accelerator directly using a software interface provided by the graphics card manufacturer.
Fig. 3 is a block diagram of a computer system 50 for generating images that can be used in accordance with the invention. Referring to Fig. 3, system 50 comprises a CPU 52, e.g. a Pentium II class CPU, comprising a cache memory 52a, a core 52b and an internal bus 52c for facilitating communication between core 52b and cache 52a. Core 52b communicates via a CPU bus 54 to a system controller 56. System controller 56 communicates with the system memory 58 via a memory bus 60. System memory 58 includes a first portion 58a which stores system memory programs and a second portion 58b that stores the texture maps such as described above.

Also included in system 50 is a PCI bus 62 for facilitating communication between system controller 56 and I/O devices 64, 66 and optionally a disk drive 68. I/O device 64 can be any type of I/O device. In one embodiment, I/O device 64 is a network interface adapter ("NIA") for receiving signals from any type of network, including but not limited to satellite broadcast, cable broadcast, telephony, fiber and topologies such as Wide Area Networks (including the Internet), Local Area Networks, Local Multiple Drop Networks, etc. In some embodiments, I/O device 64 can be a modem, and in others I/O device 64 is a tuner for receiving television signals, etc.

In one embodiment, I/O device 66 is a video capture card with a driver. Data from the video capture card is either loaded by DMA (direct memory access) or CPU 52 into a frame buffer, typically within main memory 58. However, the frame buffer may be in other memories within system 50. In some embodiments, multiple video streams or image sources are available, such as local storage, capture card 66, NIA 64 or other, not explicitly shown sources.

System 50 also includes an AGP graphics controller 70 comprising a 3D accelerator. In one embodiment, AGP graphics controller 70 communicates with system controller 56 via an AGP bus 72. In an alternative embodiment, AGP graphics controller 70 can communicate with system controller 56 via PCI bus 62 (e.g. as shown in phantom in Fig. 3).
Graphics controller 70 uses its own local memory 74 to generate and store pixel arrays to be displayed on a video display unit 76.

It is emphasized that system 50 is only one example of a system that performs a method in accordance with our invention. Other hardware can be used as well.

The above-mentioned method can be used to manipulate image streams such as television images. This method is particularly appropriate since video images comprise a succession of frames at a rate of about 60 frames/second in North America. For instance, in the case of NTSC, about 9.1 Mbytes per second throughput are required. (NTSC is an abbreviation of "North American Television Standard for Color." It is the standard used for television signals in North America.)
The system of Fig. 3 can move and tilt portions of different video images or other images onto different portions of a screen such as a television screen. In one embodiment, the images are transformed to appear on the faces of a polyhedron (e.g. a cube). As explained below, the polyhedron is used as a new type of television menu option display. In particular, a novel remote control device (described below) permits a user to turn the polyhedron to see the different images on the various faces of the polyhedron. After the polyhedron is turned to an appropriate orientation, one can "click" on a desired polyhedron face, or a portion of a desired polyhedron face, to select a desired option.

The manipulated image provided in accordance with our invention can be provided to any appropriate output device, e.g. a television screen, a video projector, an HDTV monitor, or a PC screen. The image manipulated in accordance with our invention could come from any of a number of sources, e.g. an analog or digital video input, a cable TV input, a satellite input, the internet, a digital scanner, a digital camera, or numerous other sources. (In the case of an analog input, one would first digitize the image.)
Remote Control Device Used in Conjunction with a Visual Display Device

Fig. 4 illustrates a television 200 in which the television screen 201 is divided into a primary portion 202 and a secondary portion 203. Primary portion 202 displays a primary video image. Secondary portion 203 depicts a polyhedron 204 in accordance with the invention. Television 200 includes a controller 205 for generating an image of polyhedron 204. This controller can include the hardware elements depicted in Fig. 3. The faces of polyhedron 204 can depict video images, e.g. images of what appears on the various television channels. A given face of polyhedron 204 can also include both video images and additional information (e.g. in the form of alphanumeric characters or icons), e.g. the program name, channel number, etc. In addition, the faces of the polyhedron can depict icons concerning various options, e.g. options related to television volume, on/off switches, control of a VCR, options related to editing video images, etc. (Images corresponding to such icons are stored in a memory within television controller 205.) Any appropriate screen and display technology can be used for television 200.

Television 200 is controlled by remote control device 206, which communicates with television 200 by emitting a signal, e.g. an infrared, radio or other type of signal that can be transmitted and received. (Remote control devices that communicate with a television using infrared signals are well known in the art. See U.S. Patent 4,91 ,439, for example.) In lieu of, or in addition to, emitting an IR or radio signal, remote control device 206 can be connected to and communicate with television 200 by a wire.
3 In lieu o , or in addition to emitting an IR or radio signal, remote control device 206 can 4 be connected to and communicate with television 200 by a wire 1 In one embodiment, remote control device 206 has the capability of sensing
2 motion, c g as symbolized by arrow 207, indicating rotation of device 206. Such rotation
3 is sensed, e.g by techniques described below Signals indicating such rotation are
4 communicated 1o a receiver within television 200, which in turn sends commands to
s controller 205 A 3D pipeline within controller 205 orients polyhedron 204 in a manner
6 thai minors olion of remote control device 206, eilher identically or partially Foi
7 example, the signal depicting motion of remote control device 206 can be filtcied to
8 eliminate jerking movements. Fig 5 illustrates several features of remote control device 206 Element 21 1 is a
10 transmitter for communicating with television 200 Element 21 1 is typically an IR
1 1 transmitter, but it could also be an ultrasonic, radio, magnetic induction, or other type of
12 non direct ional communication device. Since IR is somewhat directional, more than one n emittci may be used in order to guarantee proper communication while handling remote 14 control device 206
Also shown in Fig. 5 are a set of buttons and displays, e.g. light emitting diodes or liquid crystal displays, possibly trackballs, etc., symbolized as three fields 212, 213 and 214. Also shown in phantom are a battery 220, a printed circuit board 221 (containing a microcontroller 222 with built-in program store), and two motion detectors 230a, 230b. By calculating the difference in motion of these two detectors 230a, 230b, microcontroller 222 can determine the relative motion of detectors 230a, 230b as well as the direction of motion. The two motion detectors 230a, 230b thus permit microcontroller 222 to determine which way device 206 is turned. In response to such motion, microcontroller 222 communicates to controller 205 the manner in which remote control device 206 has been manipulated. Controller 205 alters the image of polyhedron 204 appropriately. The user of remote controller 206 selects an option corresponding to an image facing the user by pressing an appropriate button on the remote control device. (Microcontroller 222 reads or senses the various buttons and other input devices on remote control device 206, and provides appropriate signals to controller 205 in response thereto.)
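The difference calculation performed by microcontroller 222 can be sketched as follows. The readings, units and sign convention are illustrative assumptions; the specification only states that the difference between the two detectors reveals relative motion and direction of turn:

```python
def interpret_motion(d_a, d_b):
    """Interpret readings from the two motion detectors 230a, 230b.
    Each reading is the displacement sensed by that detector: equal
    readings mean the remote moved as a whole, while a difference
    between them indicates which way the remote was turned.
    (Sign convention: positive rotation = turned toward detector b.)"""
    translation = (d_a + d_b) / 2.0
    rotation = d_a - d_b
    return translation, rotation

# Both detectors report equal motion: pure translation, no turn.
assert interpret_motion(2.0, 2.0) == (2.0, 0.0)
# Detector a moves more than detector b: the remote is being turned.
t, r = interpret_motion(3.0, 1.0)
assert r > 0 and t == 2.0
```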
As mentioned above, controller 205 within television 200 causes motion of polyhedron 204 to mirror motion of remote control device 206. Each face of polyhedron 204 depicts one or more menu options that a user can select by pressing appropriate buttons on remote control device 206. In this way a user can select a television channel, increase or lower volume, turn the television on or off, select a signal source for the television (e.g. selecting between cable TV, a VCR or the internet), etc. In addition, one can cause an image on one of the polyhedron faces to appear on main portion 201 of the television screen.
In lieu of motion detectors 230a, 230b, other position or motion detection devices can be used, e.g. gyroscopes, GPS (global positioning system), or other inertia or position tracking devices.
It is noted that remote control device 206 is different from other types of remote control devices. For example, while trackballs cooperate with mechanical structures for sensing the motion of the trackball, the trackball can only be used with the ball mechanically resting against those structures. Remote control device 206 can sense its own motion although it is not mechanically tethered to other sensors, or does not mechanically rest against other sensors. In particular, remote control device 206 does not require a stationary non-moving component to determine the motion of remote control device 206.
Remote control device 206 is a preferred structure for manipulating polyhedron 204. However, other structures can be used for manipulating polyhedron 204, e.g. control buttons or track balls. Fig. 6 illustrates a remote control device 240 comprising a track ball 242. During use, an operator rotates track ball 242. The rotation of track ball 242 is sensed by remote control device 240, and a signal is communicated by device 240 to controller 205 within television 200, which causes polyhedron 204 to rotate.

In another embodiment, a remote control device 244 is used which comprises a rotating wheel 246 (Fig. 7). During use, an operator rotates wheel 246, which is sensed by remote control device 244. A signal is thus communicated by device 244 to controller 205 within television 200, which causes polyhedron 204 to rotate.

In the above-mentioned embodiments, video streams are bound to the various geometric surfaces forming the polyhedron. The polyhedron is rotated by altering the world coordinate system that is applied to the 3D pipeline. In one embodiment, CPU 52 within controller 205 determines what change is to be made to the world coordinate system in response to the signal controller 205 receives from remote control device 206 (or remote control device 240 or 244 as the case may be).

In one embodiment, the image on the primary portion 202 of television screen 201 is also bound to a geometric surface by a 3D graphics pipeline. One can rotate or manipulate the image on primary portion 202 of the television screen using the remote control device.

Although the geometric surfaces in second portion 203 of the television screen form a polyhedron in the embodiment of Fig. 4, in other embodiments, the geometric surfaces do not form a polyhedron.
In one embodiment, a band of images 204' is provided in second portion 203 of television screen 201 (Fig. 8). One moves band 204' by moving remote control device 206. As band 204' moves (e.g. as symbolized by arrow 208), different images become visible. For example, image 204a on the far right of band 204' will disappear and image 204b will take its place. A new image will appear at the position of image 204c.

One can select an image (or a menu option represented by that image) by selecting an image that visually appears parallel to screen 201 of television 200. In other words, by actuating an appropriate control button on control device 206, the image that appears parallel to screen 201 (typically center-most image 204d) is selected. In another embodiment, one can move a cursor (using a control button on control device 206) to point to a particular image within band 204', and then actuate another control button to select that image (or the menu option represented by that image). Alternatively, one of the positions along band 204' can be highlighted or otherwise marked as representing an image to be selected. One can move different images to the marked position to select that image.
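The scroll-and-select behavior of band 204' can be sketched with a small model. Modular indexing gives the closed-band behavior in which scrolling far enough returns the images to their original positions; the class and method names are illustrative assumptions:

```python
class Band:
    """A sketch of the band of images 204': a list of menu images, one
    of which (the center-most image, appearing parallel to the screen)
    is the current selection target. Modular indexing models a
    "closed band": scrolling all the way around restores the
    original arrangement."""
    def __init__(self, images):
        self.images = list(images)
        self.offset = 0  # how far the band has been scrolled

    def scroll(self, steps):
        """Advance the band by `steps` positions (e.g. per arrow 208)."""
        self.offset = (self.offset + steps) % len(self.images)

    def centermost(self):
        """The image currently parallel to the screen, i.e. selectable."""
        return self.images[self.offset]

band = Band(["ch2", "ch4", "ch5", "ch7"])
band.scroll(2)
assert band.centermost() == "ch5"
band.scroll(2)  # a full trip around the closed band...
assert band.centermost() == "ch2"  # ...returns to the original image
```

An open band would instead clamp the offset at the ends of the list, leaving no image exposed beyond the left-most or right-most image.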
Band 204' can be either a "closed band" or an "open band." By "closed band" I mean a band whereby scrolling band 204' far enough in one direction (e.g. rotating the band 360 degrees to the right) will eventually result in the same images being returned to their original position. This is to be contrasted with an open band of images in which moving the images to the right, for example, will eventually expose a left-most image, with no image exposed to the left of that left-most image.

Selection of images on a face of polyhedron 204 can be accomplished in a manner similar to image selection for band 204'.

While the invention has been described with respect to specific embodiments, those skilled in the art will appreciate that changes can be made in form and detail without departing from the spirit and scope of the invention. For example, instead of using a polyhedron, other multi-face images can be used in the above-described manner. Further, a plurality of polyhedra or bands can be depicted and manipulated on a television screen. Each face of the polyhedra or bands can include two or more portions that can depict various options. As mentioned above, instead of using motion detectors within the remote control device, position detectors can be used. Different types of display devices can be used in conjunction with our invention, e.g. CRT screens, LCD screens, or other display devices. Accordingly, all such changes come within the invention.

Claims

I claim: 1. Method comprising: providing a display screen and a control device, said display screen displaying a plurality of surfaces, an image being depicted on each of said surfaces within said plurality of surfaces; actuating a control input to said control device; and manipulating the orientation of said surfaces in response to said control input.
2. Method of claim 1 wherein said images on said surfaces are provided by a graphics pipeline.
3. Method of claim 1 wherein at least some of said images comprise video streams.
4. Method of claim 1 wherein said images on said surfaces depict menu options, said method further comprising the step of selecting one of said depicted options.
5. Method of claim 1 wherein said control device is a remote control device that senses the motion and/or position of said remote control device.
6. Method of claim 1 wherein said control device is a remote control device comprising a rotation wheel, such that a user can rotate said rotation wheel, said step of manipulating comprising the step of manipulating the orientation of said surfaces in response to said rotation.
7. Method of claim 1 wherein said control device is a remote control device that comprises a track ball, such that a user can rotate said track ball, said step of manipulating comprising the step of manipulating the orientation of said surfaces in response to said rotation of said track ball.
8. Method of claim 1 wherein said surfaces form a polyhedron, said act of actuating changing the orientation of said polyhedron.
9. Method of claim 8 further comprising the step of applying said images to said polyhedron with a graphics pipeline.
10. Method of claim 1 wherein said surfaces form a band of images.
11. Method comprising the steps of: displaying a set of images on a display screen, said images corresponding to a control option that can be exercised, such that the display of said images represents a display of a list of control options that can be exercised; actuating a control to thereby move said plurality of images on said screen, thereby changing the list of options being displayed that can be exercised.
12. Method of claim 11 wherein at least some of said images are video images.
13. Method of claim 11 further comprising the step of selecting one of said images, thereby selecting an option corresponding to said selected image.
14. Method of claim 11 wherein said images are arranged as a band of images.
15. Method of claim 11 wherein said images are arranged to form a polyhedron.
16. Method comprising the steps of: displaying a plurality of surfaces on a display screen, an image appearing on said surfaces; and rotating said images in response to actuation of a control device.
17. Method of claim 16 wherein the surfaces form a polyhedron.
18. Method of claim 17 wherein said screen comprises a main portion and a secondary portion, said polyhedron being displayed on said secondary portion, said method further comprising the step of selecting one of the images of said polyhedron and displaying said selected image on said main portion.
19. Method of claim 16 wherein the surfaces form a band of images.
20. Apparatus comprising: a display device displaying a plurality of images on a plurality of surfaces; and a control device, wherein actuation of said control device rotates said surfaces.
21. Apparatus of claim 20 further comprising a graphics pipeline for generating said plurality of images on said plurality of surfaces.
22. Apparatus of claim 20 wherein said surfaces form a polyhedron.
23. Apparatus of claim 20 wherein at least some of said images comprise video images.
24. Apparatus of claim 20 wherein each of said images depicts a menu option, and said control device is a remote control device containing a control for selecting one of said depicted options.
25. Apparatus of claim 20 wherein said control device is a remote control device that senses the motion and/or position of said remote control device.
26. Apparatus of claim 20 wherein said control device is a remote control device that comprises a rotation wheel, such that a user can rotate said rotation wheel, wherein said graphics pipeline manipulates the orientation of said surfaces in response to rotation of said wheel.
27. Apparatus of claim 20 wherein said control device is a remote control device that comprises a track ball, such that a user can rotate said track ball, wherein the orientation of said surfaces is manipulated in response to said rotation of said track ball.
28. Apparatus comprising: a display screen for displaying a plurality of images, each of said images corresponding to a control option; a control device for moving said plurality of images, whereby different sets of images corresponding to different control options can be displayed on said display screen, said control device also comprising a control element for selecting one of said options.
29. Apparatus of claim 28 wherein at least some of said images are video images.
30. Apparatus of claim 28 wherein said images are arranged as a polyhedron, the orientation of said polyhedron being controlled by said control device.
31. Apparatus of claim 28 wherein said images are arranged as a band of images.
32. Apparatus of claim 28 wherein said display screen comprises primary and secondary regions, said plurality of images being displayed in said secondary region, at least some of said images within said plurality of images corresponding to control options of what is to be displayed in said primary region.
33. Apparatus comprising: an image display device; and a remote control device for being held in a user's hand and for controlling the image displayed on said image display device, said remote control device detecting the angle or position at which the user is holding said remote control device.
34. Apparatus of claim 33 wherein said remote control device further comprises two motion sensors and a circuit for calculating the difference between the motion of the two motion sensors.
35. Apparatus of claim 33 wherein said remote control device further comprises a gyroscope for sensing motion and/or position of said remote control device.
36. Apparatus of claim 33 wherein the position and/or motion of said remote control device is sensed using a global positioning system.
37. Apparatus of claim 33 wherein said remote control device determines said angle or position without reference to the position of a fixed non-moving structure mechanically coupled to a moving structure.
38. Apparatus of claim 33 wherein said display device comprises a screen for displaying an image, said image comprising a plurality of faces, the orientation of said faces being changed in response to the position of said remote control device.
39. Apparatus of claim 38 wherein said plurality of faces form at least one polyhedron.
40. Apparatus of claim 38 wherein said plurality of faces forms a band of images.
41. Apparatus of claim 38 further comprising a graphics pipeline for providing an image on each of the faces within said plurality.
42. A remote control device for being held in a user's hand, said remote control device comprising: first and second motion sensors; and a circuit for determining the motion and/or position of said remote control device based on the motions sensed by said first and second motion sensors, said circuit providing a signal indicative of the motion and/or position in which the remote control device is being held.
43. A remote control device for being held in a user's hand, said remote control device comprising: a gyroscope; and a circuit for determining the motion and/or position of said remote control device based on motion sensed by said gyroscope, said circuit providing a signal indicative of the motion and/or position in which the remote control device is being held.
44. A remote control device for being held in a user's hand, said remote control device comprising: first and second position sensors; and a circuit for determining the motion and/or position of said remote control device based on the positions sensed by said first and second position sensors, said circuit providing a signal indicative of the motion and/or position in which the remote control device is being held.
45. A remote control device comprising a member for being held by the hand of a user, said remote control device providing a signal indicative of the motion and/or position in which said member is being held without said member being mechanically coupled to a second structure and generating a signal indicative of the relative motion between said member and said second structure.
46. A method for using a remote control device, said remote control device comprising: first and second motion sensors; and a circuit for determining the motion and/or position of said remote control device based on the motions sensed by said first and second motion sensors, said circuit providing a signal indicative of the motion and/or position in which the remote control device is being held, said method comprising: moving said remote control device; causing said circuit to calculate the motion and/or position of said remote control device; and providing a signal indicative of said motion and/or position.
47. A method for using a remote control device, said remote control device comprising: a gyroscope; and a circuit for determining the motion and/or position of said remote control device based on motion sensed by said gyroscope, said circuit providing a signal indicative of the motion and/or position in which the remote control device is being held, said method comprising the steps of: moving said remote control device; and causing said circuit to emit a signal indicative of the motion and/or position of said remote control device.
48. A method for using a remote control device, said remote control device comprising first and second position sensors, said method comprising the steps of: calculating the position and/or motion of said remote control device in response to the position sensed by said position sensors; and providing a signal indicative of said calculated position and/or motion.
49. A remote control device comprising a member for being held by the hand of a user, said remote control device providing a signal indicative of the motion and/or position in which said member is being held without said member being mechanically coupled to a second structure and generating a signal indicative of the relative motion between said member and said second structure.
50. A method for using a remote control device, said remote control device comprising: a member for being held by the hand of a user, said remote control device providing a signal indicative of the motion and/or position in which said member is being held without said member being mechanically coupled to a second structure and generating a signal indicative of the relative motion between said member and said second structure, said method comprising: grasping said remote control device and moving said remote control device to thereby cause said remote control device to generate a signal indicating the motion and/or position of said remote control device.
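Claims 34, 42, and 46 recite a remote control whose motion is determined by a circuit from the readings of two motion sensors. Purely as an illustrative sketch of that two-sensor idea (not the patented implementation; the function name, units, and baseline parameter are hypothetical), the rotation of a rigid hand-held device can be estimated from the difference between two sensors mounted a known distance apart: equal readings imply pure translation, while a difference implies rotation.

```python
# Illustrative sketch only: estimate angular velocity of a rigid
# remote from two motion sensors separated by a fixed baseline.
# If both sensors report the same linear velocity, the device is
# translating; a velocity difference divided by the baseline
# approximates rotation about the axis perpendicular to the baseline.
# All names here are hypothetical, not taken from the patent.

def angular_velocity(v_sensor_a, v_sensor_b, baseline_m):
    """Estimate angular velocity in rad/s from the linear
    velocities (m/s) of two sensors baseline_m meters apart."""
    if baseline_m <= 0:
        raise ValueError("baseline must be positive")
    return (v_sensor_b - v_sensor_a) / baseline_m

# Pure translation: both sensors move identically, so no rotation.
print(angular_velocity(0.5, 0.5, 0.1))  # 0.0
# Rotation: the sensors move at different speeds.
print(angular_velocity(0.0, 0.2, 0.1))  # 2.0
```

A real circuit of this kind would integrate or filter raw sensor samples over time; the one-shot division above only shows why two spatially separated sensors suffice to distinguish rotation from translation without any mechanical coupling to a fixed structure.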
PCT/US2000/002870 1999-02-03 2000-02-02 Novel method and apparatus for controlling video programming WO2000046680A1 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US11850599P 1999-02-03 1999-02-03
US60/118,505 1999-02-03
US09/344,442 1999-06-25
US09/344,442 US6342884B1 (en) 1999-02-03 1999-06-25 Method and apparatus for using a general three-dimensional (3D) graphics pipeline for cost effective digital image and video editing, transformation, and representation
US37818499A 1999-08-20 1999-08-20
US37827099A 1999-08-20 1999-08-20
US09/378,270 1999-08-20
US09/378,184 1999-08-20

Publications (1)

Publication Number Publication Date
WO2000046680A1 true WO2000046680A1 (en) 2000-08-10

Family

ID=27494201

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/002870 WO2000046680A1 (en) 1999-02-03 2000-02-02 Novel method and apparatus for controlling video programming

Country Status (1)

Country Link
WO (1) WO2000046680A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5339095A (en) * 1991-12-05 1994-08-16 Tv Interactive Data Corporation Multi-media pointing device
US5452414A (en) * 1990-05-09 1995-09-19 Apple Computer, Inc. Method of rotating a three-dimensional icon to its original face
US5459489A (en) * 1991-12-05 1995-10-17 Tv Interactive Data Corporation Hand held electronic remote control device
US5515486A (en) * 1994-12-16 1996-05-07 International Business Machines Corporation Method, apparatus and memory for directing a computer system to display a multi-axis rotatable, polyhedral-shape panel container having front panels for displaying objects
US5598187A (en) * 1993-05-13 1997-01-28 Kabushiki Kaisha Toshiba Spatial motion pattern input system and input method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2259579A2 (en) 2000-01-16 2010-12-08 JLB Ventures LLC Electronic Programming Guide
EP1821529A2 (en) * 2006-02-17 2007-08-22 Samsung Electronics Co., Ltd. Three-dimensional electronic programming guide providing apparatus and method
EP1821529A3 (en) * 2006-02-17 2009-10-14 Samsung Electronics Co., Ltd. Three-dimensional electronic programming guide providing apparatus and method
US8613018B2 (en) 2006-02-17 2013-12-17 Samsung Electronics Co., Ltd. Three-dimensional electronic programming guide providing apparatus and method
US20110022988A1 (en) * 2009-07-27 2011-01-27 Lg Electronics Inc. Providing user interface for three-dimensional display device
CN101969573A (en) * 2009-07-27 2011-02-09 Lg电子株式会社 Graphic user interface for three-dimensional image display device
EP2293174A1 (en) * 2009-07-27 2011-03-09 Lg Electronics Inc. Graphic user interface for three-dimensional image display device
US8413073B2 (en) 2009-07-27 2013-04-02 Lg Electronics Inc. Providing user interface for three-dimensional display device
CN101969573B (en) * 2009-07-27 2013-04-10 Lg电子株式会社 User interface for three-dimensional image display device
CN102404524A (en) * 2010-09-14 2012-04-04 康佳集团股份有限公司 Video menu setting method, device and 3D television

Similar Documents

Publication Publication Date Title
US6525728B2 (en) Method and apparatus for using a general three-dimensional (3D) graphics pipeline for cost effective digital image and video editing, transformation, and representation
CN104321803B (en) Image processing apparatus, image processing method and program
US6968973B2 (en) System and process for viewing and navigating through an interactive video tour
US5963215A (en) Three-dimensional browsing of multiple video sources
US6346967B1 (en) Method apparatus and computer program products for performing perspective corrections to a distorted image
US6760026B2 (en) Image-based virtual reality player with integrated 3D graphics objects
US6456287B1 (en) Method and apparatus for 3D model creation based on 2D images
US6370267B1 (en) System for manipulating digitized image objects in three dimensions
CN105144229B (en) Image processing apparatus, image processing method and program
US6570581B1 (en) On-location video assistance system with computer generated imagery overlay
WO2012046372A1 (en) Image generation device, and image generation method
USRE43490E1 (en) Wide-angle dewarping method and apparatus
CN108292489A (en) Information processing unit and image generating method
JPH09238367A (en) 1996-02-29 1997-09-09 Television signal transmission method, television signal transmitter, television signal reception method, television signal receiver, television signal transmission/reception method and television signal transmitter-receiver
JPH10232940A (en) Device and method for corner detection
GB2313246A (en) Channel selecting system
US20060114251A1 (en) Methods for simulating movement of a computer user through a remote environment
WO1997042601A1 (en) Integrated interactive multimedia process
Bradley et al. Image-based navigation in real environments using panoramas
CN107005689B (en) Digital video rendering
Endo et al. Image-based walk-through system for large-scale scenes
WO2000046680A1 (en) Novel method and apparatus for controlling video programming
JP2000067227A (en) Image display device method and recording medium
Honkamaa et al. A lightweight approach for augmented reality on camera phones using 2D images to simulate 3D
JP4498450B2 (en) Display device

Legal Events

Date Code Title Description
AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase