WO2009083863A1 - Playback and overlay of 3d graphics onto 3d video - Google Patents


Info

Publication number
WO2009083863A1
WO2009083863A1 · PCT/IB2008/055338
Authority
WO
WIPO (PCT)
Prior art keywords
information
stream
graphics
depth
video
Prior art date
Application number
PCT/IB2008/055338
Other languages
French (fr)
Inventor
Francesco Scalori
Philip S. Newton
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2009083863A1 publication Critical patent/WO2009083863A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Definitions

  • the present invention relates to a method of playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • the invention also relates to an apparatus for playback of the information stream as described herein-above and to a signal comprising the information stream as described herein-above.
  • With the introduction of new 3D displays, there is an opportunity for 3D video to break through to the mass consumer market. Such 3D displays are able to handle both 3D display and 2D display.
  • introducing 3D video does not only relate to introducing new displays capable of 3D display, but it also has impact on the whole content production and delivery chain.
  • the production of 3D video content is at an embryonic technology stage, and various formats are proposed to be used, each with their own advantages and disadvantages.
  • new coding methods were introduced for coding 3D content and new formats were proposed to include the 3D video stream in MPEG streams.
  • a known fact is that the introduction of new formats is usually slow, and a desired feature when introducing a new format is backwards playback compatibility with the installed player base.
  • a missing area has been the carriage of 3D video content in a content distribution or publishing format such as Digital Video Broadcasting (DVB) or DVD and high definition formats such as Blu-ray Disc (BD) or HD-DVD while maintaining backwards compatibility with the installed player base.
  • An important feature of high definition publishing formats is the ability of content providers to provide multiple video streams such as picture-in-picture as well as graphics and interactive streams. For example, in the case of BD, DVD and HD-DVD it is known that such systems allow playback of video and graphics (e.g. subtitles, navigation buttons) at the same time. Usually graphics streams such as subtitles should always appear in front of the main video and are therefore added later to the final picture to be displayed.
  • every pixel, belonging to either the video or a graphics stream, has a depth relative to the display. Such depth is either directly associated therewith if a 2D + depth coding of the 3D streams is used, or the depth information can be directly inferred from other coding systems, such as 2D + parallax information.
  • Figure 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems.
  • in such systems there exists a main movie plane, a presentation plane comprising static graphic objects, and an interactive plane comprising interactive objects.
  • the three planes are overlaid on each other: the main movie plane in the background, the presentation plane on top of the main movie and the interactive plane most forward.
  • the right image in Fig. 1a indicates the output image with the three planes overlaid.
  • Figure 1b is a 2D representation of how such planes might intertwine in case of 3D display of each stream. Due to depth, some parts of the main movie plane may have a depth closer to the viewer than that of the graphics items. In such parts, the foreground graphic objects are punctured and text becomes difficult to read, while the general aspect of the displayed image is broken and unpleasing. In the case of graphics streams, this is particularly problematic as the graphics may appear at any location in the video and are dependent on input from the user.
  • the object of the invention is reached by a method according to claim 1 for playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • the method comprises reading or receiving the information stream; determining an available depth range for 3D display, attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream, scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream.
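As a rough illustration, the range attribution and depth scaling described above can be sketched as follows. This is a minimal sketch; the function names and the equal split between planes are illustrative assumptions, not taken from the patent (which also allows unequal ranges):

```python
def attribute_ranges(available_min, available_max, num_planes):
    """Split the display's available depth range into equal,
    non-overlapping sub-ranges, one per presentation plane.
    The equal split is an assumption for illustration."""
    span = (available_max - available_min) / num_planes
    return [(available_min + i * span, available_min + (i + 1) * span)
            for i in range(num_planes)]

def scale_depth(depth, src_min, src_max, dst_min, dst_max):
    """Linearly rescale one depth value from its source range into
    the non-overlapping range attributed to the stream's plane."""
    if src_max == src_min:
        return dst_min
    t = (depth - src_min) / (src_max - src_min)
    return dst_min + t * (dst_max - dst_min)
```

With two planes on a display offering depth values 0 to 255, the video plane might get (0, 127.5) and the graphics plane (127.5, 255); after scaling, no video pixel can have a depth in front of a graphics pixel.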
  • This is based on the insight that the occlusion problem between different planes is solved by segmenting the interval of possible depth values which can be displayed by a display into non-overlapping ranges and assigning them to the existing presentation planes, such as the video and graphics planes, followed by rescaling of the depth of each stream to its assigned range.
  • the highest depth of a pixel of one plane is smaller than the lowest depth of a pixel in the next plane (going in the direction of increasing depth).
  • this concept is applicable to any playback system that displays at least two overlapping 3D graphics streams or non-moving streams of pictures, such as slideshows, or two overlapping video streams.
  • the invention is also applicable to displaying 3D secondary video on top of 3D video or to displaying rendered 3D graphics objects on top of 3D graphic backgrounds.
  • the information stream further comprises overlay information for overlaying the at least one graphics stream onto the main video stream, the overlay information comprising the non-overlapping depth ranges wherein the non-overlapping depth ranges are preferably defined as depth percentages of the available depth range.
  • the range limits could be defined while authoring the content, hence giving authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles).
  • a non-absolute range indication, e.g. a percentage relative to the maximum depth value of the target screen, may be used.
  • the information stream is BD compatible and it comprises a video stream, a graphics stream and an interactive graphics stream, the interactive graphics stream being displayed in front of the graphics stream, which is displayed in front of the main video stream.
  • an optimal value for the depth ranges corresponds to the depth ranges of the main movie stream, graphics stream and interactive graphics stream being in the ratio 5:3:2.
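The 5:3:2 allocation can be turned into concrete contiguous, non-overlapping ranges. A hedged sketch (the function name and the back-to-front ordering, with the main movie taking the deepest share, are illustrative assumptions):

```python
def ranges_from_ratio(total_depth, ratios):
    """Allocate contiguous, non-overlapping depth ranges according to
    a ratio such as 5:3:2 for the main movie, graphics and interactive
    graphics streams, ordered back (largest share) to front."""
    total = sum(ratios)
    ranges, start = [], 0.0
    for r in ratios:
        end = start + total_depth * r / total
        ranges.append((start, end))
        start = end
    return ranges
```

For a total depth of 100, the 5:3:2 ratio yields the ranges (0, 50), (50, 80) and (80, 100).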
  • the invention also relates to an apparatus for playback of an information stream suitable to be played back on a three-dimensional (3D) display as defined in claim 7 and to a signal as defined in claim 11.
  • Fig. 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems
  • Fig. 1b is a 2D representation of occlusion of the graphics stream by the video stream when both streams are displayed in 3D.
  • Fig. 2 illustrates schematically a playback device wherein the invention is practiced
  • Fig. 3 illustrates schematically various presentation planes and the associated depth ranges according to an embodiment of the invention
  • Fig. 4 illustrates schematically an embodiment according to the invention of the video processing unit and the rendering stage.
  • Fig. 2 illustrates schematically a playback device wherein the invention is practiced. It is duly noted that this describes a particular embodiment corresponding to playback from optical discs, but the source of the information stream is irrelevant; it may be provided locally on recorded media such as optical media, a hard disc or solid state memory, or it can be received via broadcasting over wired or wireless transmission systems, including the internet.
  • the invention may be implemented in any device for playback of video information, including, among others, hard-disc recorders, set top boxes (STB) and digital (satellite/terrestrial/cable) receivers.
  • Optical discs have a track, the track being the position of the series of prerecorded marks representing information, arranged in accordance with a single spiral pattern constituting substantially parallel tracks on an information layer.
  • the optical disc may comprise one or more information layers of a recordable type.
  • examples of prerecorded optical discs are CD-ROM, DVD-ROM, or high density discs such as HD DVD-ROM or BD-ROM.
  • references: ECMA-130 and ECMA-267 (ISO/IEC 16449)
  • the information is represented on the information layer by optically detectable marks along the track.
  • the track 12 on the optical disc is indicated by a pre-embossed track structure provided during manufacture of the blank optical disc.
  • the track structure is constituted, for example, by a pregroove, which enables a read/write head to follow the track during scanning.
  • the optical disc is intended for carrying user information according to a standardized format, to be playable on standardized playback devices.
  • the recording format includes the way information is recorded, encoded and logically mapped onto the recording space provided by the track.
  • the recordable space is usually subdivided into a lead-in area (LI) 31, a data zone (DZ) for recording the information and a lead-out area (LO).
  • the lead-in area (LI) usually comprises basic disc management information and information how to physically access the data zone (DZ).
  • said basic disc management information corresponds to the table of contents in CD systems or the formatting disc control blocks (FDCB) in DVD systems.
  • the user information recorded in the data zone (DZ) is further arranged according to an application format, for example comprising a predefined structure of files and directories.
  • the user information in the data zone is arranged according to a file system comprising file management information, such as ISO 9660 used in CD systems, available as ECMA-119, or UDF used in DVD systems, available as ECMA- 167.
  • the recording device is provided with scanning means for scanning the track of the optical disc, the scanning means comprising a drive unit 16 for rotating the optical disc 11, a head 18, a positioning unit 21 for coarsely positioning the head 18 in the radial direction on the track, and a control unit 17.
  • the head 18 comprises an optical system of a known type for generating a radiation beam 20 guided through optical elements for focusing said radiation beam 20 to a radiation spot 19 on the track 12 of the optical disc 11.
  • the radiation beam 20 is generated by a radiation source.
  • the head further comprises (not shown) a focusing actuator for moving the focus of the radiation beam 20 along the optical axis of said beam and a tracking actuator for fine positioning of the radiation spot 19 in a radial direction on the center of the track.
  • the tracking actuator may comprise coils for radially moving an optical element or may alternatively be arranged for changing the angle of a reflecting element.
  • the radiation reflected by the information layer is detected by a detector of a usual type, e.g. a four-quadrant diode, in the head 18 for generating a read signal and further detector signals, such as a tracking error and a focusing error signal for controlling said tracking and focusing actuators.
  • the control unit 17 controls the retrieving of information from the optical disc 11, and may be arranged for receiving commands from a user or from a host computer. To this end, the control unit 17 may comprise control circuitry, for example a microprocessor, a program memory and control gates, for performing the procedures described hereinafter.
  • the control unit 17 may also be implemented as a state machine in logic circuits.
  • the read signal is processed by a read processing unit comprising a demodulator 26, a de-formatter 27 and an output unit 28 for processing the information and outputting said information to suitable means, such as a display or speakers.
  • the functioning of the demodulator 26, the de-formatter 27 and the output unit 28 is controlled by the control unit 17.
  • retrieving means for reading information include the drive unit 16, the head 18, the positioning unit 21 and the read processing unit.
  • the demodulator 26 is responsible for de-modulating a data signal from the channel signal, by using suitable channel decoder, e.g. as disclosed in US 5,920,272 or US 5,477,222.
  • the de-formatter 27 is responsible for using error correction codes and/or de-interleaving for extracting the information signal from the data signal.
  • the output unit 28, under the control of the control unit 17, is responsible for processing the information signal at logical level. Furthermore, it is noted that the information signal may be arranged according to a playback format, which may prescribe that management information is associated with the audio-video information. Hence the output unit is responsible for separating management information from the audio-video information, and for de-multiplexing and/or decoding the audio and/or video information. Suitable compression/de-compression means are described for audio in WO 98/16014-A1 (PHN 16452), and for video in the MPEG2 standard (ISO-IEC 13818). The recording format in which the user information is to be recorded prescribes that management information for managing the recorded user information is also recorded onto the optical disc.
  • the video and audio information generated by the output unit 28 is sent to suitable means, such as a suitable display for the video information.
  • 3D displays are known, one of them being described in US 6,069,650.
  • the display device comprises an LCD display comprising actively switchable Liquid Crystal lenticular lens. Depending on the image content a defined set of locations at the display can be switched to either 2D or 3D mode.
  • each plane is linked to the output of a dedicated decoder.
  • on the Primary video plane, moving or still picture data from the Primary video decoder is presented.
  • on the Secondary video plane, moving picture data from the Secondary video decoder is presented.
  • on the Presentation Graphics plane, graphic data from either the Presentation Graphics decoder or the Text subtitle decoder is presented. These data on two planes are first overlaid to make interim video data. The transparency ratio between the two planes is defined as an alpha value in the CLUT of the Presentation Graphics plane.
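The overlay using an alpha value from the CLUT amounts to a standard per-pixel blend. A minimal sketch, with flat pixel lists and the function name as illustrative assumptions:

```python
def compose_planes(background, foreground, alpha):
    """Blend two planes pixel by pixel. The alpha value is taken from
    the Presentation Graphics plane's CLUT: 1.0 means the foreground
    pixel is fully opaque, 0.0 fully transparent."""
    return [b * (1.0 - a) + f * a
            for b, f, a in zip(background, foreground, alpha)]
```

An alpha of 0.5 mixes the two planes equally; an alpha of 1.0 simply replaces the background pixel.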
  • the multiple views necessary for a 3D display can be computed based on a 2D picture and an additional picture, a so-called depth map, as described in Oliver Schreer, "3D Video Communication", Wiley, 2005, pages 29-34.
  • the depth map conveys information about the depth of objects in the 2D image.
  • the grey scale values in the depth map indicate the depth of the associated pixel in the 2D image.
  • a stereo display can calculate the additional view required for stereo by using the depth value from the depth map and by calculating the required pixel transformation.
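The pixel transformation for the additional view can be sketched as depth-image-based rendering on a single scan line. This is a deliberately naive sketch: the linear depth-to-disparity mapping, the shift direction, and the hole-filling strategy are illustrative assumptions, and real renderers handle occlusion ordering and de-occlusion far more carefully:

```python
def render_second_view(row, depth_row, max_disparity):
    """Shift each pixel of one scan line horizontally by a disparity
    proportional to its depth-map value (0..255), then fill holes left
    by the shift with the nearest already-rendered pixel."""
    width = len(row)
    out = [None] * width
    for x in range(width):
        disparity = round(max_disparity * depth_row[x] / 255.0)
        tx = x + disparity
        if 0 <= tx < width:
            out[tx] = row[x]
    # naive hole filling: propagate the last known pixel value
    last = row[0]
    for x in range(width):
        if out[x] is None:
            out[x] = last
        else:
            last = out[x]
    return out
```

With a flat depth map (all zeros), the second view is identical to the first; non-zero depth values displace pixels and create the parallax that the stereo display needs.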
  • an MPEG 3D video stream would comprise a 2D video stream (as either a program stream or an elementary video transport stream) and, multiplexed with the 2D video stream, an auxiliary stream comprising additional information to enable 3D display (such as a depth map stream).
  • although the 2D video + depth map format was described as the preferred format for implementing the invention, it is not the only format that can be supported.
  • the 2D video + depth map may be extended by adding background de-occlusion information and transparency information, or stereo + depth may be used as input format.
  • the multiple views may be used as input signal and mapped directly onto the display (sub) pixels.
  • in the 2D + depth format as previously described, the full-resolution image is divided into four quadrants: one is used for the 2D content while another carries the depth information.
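That quadrant layout can be sketched as follows. The function name and the choice of which quadrant holds which component are illustrative assumptions; the text only states that one quadrant carries the 2D content and another the depth:

```python
def pack_quad_frame(image2d, depth_map, fill=0):
    """Pack a half-resolution 2D image and its equally sized depth map
    into one full-resolution frame divided into four quadrants:
    top-left = 2D content, top-right = depth map. The bottom quadrants
    are filled with `fill` (richer variants of such formats use them
    for e.g. de-occlusion data). Inputs are lists of rows."""
    h, w = len(image2d), len(image2d[0])
    top = [img_row + depth_row
           for img_row, depth_row in zip(image2d, depth_map)]
    bottom = [[fill] * (2 * w) for _ in range(h)]
    return top + bottom
```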
  • each plane has an associated depth range before the planes are composed together into the final image that will be shown on the screen.
  • a method of playback in a basic embodiment of the invention, wherein 3D objects are overlaid over a 3D video stream, comprises the steps of: reading or receiving the information stream; determining the available depth range for 3D display; attributing non-overlapping depth ranges to the 3D video stream and to the 3D objects; scaling the respective depth information to the corresponding depth ranges; and using the scaled depth information for 3D display.
  • in a second embodiment of the invention, illustrated in Fig. 3, this is extended to three planes, such as a video plane and two graphics planes as used in BD systems.
  • reference numerals 35, 36, and 37 indicate the relative depth of each of the Main Movie plane, the Presentation plane and the Interactive plane.
  • depth is illustrated as increasing in the direction away from the viewer.
  • a preferred choice of ranges is 50% to the main movie plane, 30% to the presentation plane and 20% to the interactive plane.
  • a further problem addressed by the inventors is how to provide such depth range choice needed to avoid the occlusion problems in a way that maintains backward compatibility with known systems.
  • in BD systems it is known that three types of graphics segments exist: Object Definition Segments, which store the bitmap values of a certain graphics object; Palette Definition Segments, which provide the mapping between those values and real colours; and Presentation and Interactive Composition Segments, which provide information on the way in which the current graphics element should be added to the graphics plane.
  • when implementing 3D objects, it is expected that two extra types of segments are used along with the previous ones, namely the Depth Map Object Definition Segment and the Depth Map Palette Segment.
  • a new data field, called depth_percentage, is added to the Composition Segment structure. Since Composition Segments also hold a reference to the Depth Map Palette Segment to be used with a certain graphics object, this allows rescaling that depth map according to the expressed percentage. Therefore this enables the association of a different portion of depth to different graphics planes, for example 20% to subtitles and 30% of the whole depth to interactive menus.
  • the same effect may be achieved if the depth_percentage field is included directly into the Depth Map Palette Segment definition.
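A sketch of how a player might apply such a percentage to a Depth Map Palette. The depth_percentage field is the one proposed in the text; the plane_offset_pct parameter (where the plane's share of the total depth starts) and the function name are assumptions made for illustration:

```python
def rescale_depth_palette(palette, depth_percentage, plane_offset_pct, full_depth):
    """Rescale a depth-map palette (a list of depth values, one per
    palette entry) so that its minimum and maximum fall within the
    portion of the total depth assigned to this graphics plane:
    a span of `depth_percentage` percent of `full_depth`, starting at
    `plane_offset_pct` percent."""
    lo = full_depth * plane_offset_pct / 100.0
    span = full_depth * depth_percentage / 100.0
    p_min, p_max = min(palette), max(palette)
    if p_max == p_min:
        return [lo] * len(palette)
    return [lo + span * (v - p_min) / (p_max - p_min) for v in palette]
```

For example, an interactive-graphics palette given 20% of a 0-255 depth space starting at the 80% mark ends up confined to depths 204-255.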
  • the information stream processed by the output unit 28 is provided to a video processing unit 31, responsible for implementing the known functions of the player model, such as buffering, demultiplexing, processing each elementary stream and executing received commands.
  • the video processing unit 31 is usually implemented as a combination of software and hardware.
  • the processed video information stream is provided to a rendering unit, which is responsible for processing the video information into a signal suitable for 3D display. It is noted that the rendering unit may also be implemented in the 3D display itself.
  • the function of the control unit 17, of the video processing unit 31 and of the rendering stage 32 may be implemented in a device by the same hardware and/or software block.
  • the control unit 17 is adapted to determine an available depth range for the 3D display, to attribute corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream, and to scale each of the video depth information and the at least one graphics depth information to the corresponding depth range, while the rendering unit is adapted to use the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) rendering of the information stream.
  • depth determination means 29 are provided in the control unit, preferably implemented as firmware or as embedded software.
  • two additional functions need to be performed within the functional block comprising the video processing unit 31 and the rendering stage 32, a specific embodiment of which is illustrated in more detail in Fig. 4.
  • the received stream is demultiplexed and buffered in unit 41, and the graphics stream is sent to the Stream graphics processor 42.
  • for generating the graphic images to be displayed for each of the right and left views, two buffers 43 and 44 are provided under the control of a graphics controller.
  • the two buffers 43 and 44 supply the two graphics plane processors 45 and 46.
  • both graphics decoders (43, 45 and, respectively, 44, 46) are adapted to take into account the value of depth_percentage present in the Composition Segments.
  • the depth map palette has to be adapted to the depth_percentage value, i.e. the minimum and maximum depth contained in the palette have to be within that percentage of the whole possible depth values.
  • BD and HD-DVD also support a secondary video plane for PIP.
  • the video in the PIP may appear side by side with the main video or in a quarter of the screen.
  • the primary video cannot be scaled so the secondary video always covers part of the primary video.
  • the secondary video is either fully transparent or fully opaque. So it is advantageous that, when the secondary video is fully opaque, the primary video does not punch through the other video during rendering of the 2D +depth information in the display, in a similar way as what can happen with the graphics.
  • an alternative solution is possible in the case of overlaying two video streams.
  • the primary video and the secondary video are combined both for the 2D and for the depth information.
  • the data from the secondary video plane simply overwrites the pixels of the primary video on the same presentation plane completely.
  • Such a solution is possible in view of the fact that there is no semi-transparency and there is no strong requirement for occlusion information of the primary video that is concealed by the secondary video.
  • This is in contrast to overlaying a graphics stream, as the graphics stream may be semi-transparent and may overlay the video stream in any shape and location.
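The overwrite-based combination for an opaque PIP can be sketched as follows. The function name and the list-of-lists image representation are illustrative assumptions; the point from the text is that both the 2D picture and the depth map are overwritten, so the primary video cannot punch through the PIP window:

```python
def overlay_pip(primary, primary_depth, secondary, secondary_depth, x0, y0):
    """Overlay an opaque secondary (PIP) video onto the primary video
    by overwriting pixels in place, in both the 2D picture and the
    depth map. All images are 2D lists of pixel values; (x0, y0) is
    the top-left corner of the PIP window in the primary frame."""
    for dy, (srow, drow) in enumerate(zip(secondary, secondary_depth)):
        for dx in range(len(srow)):
            primary[y0 + dy][x0 + dx] = srow[dx]
            primary_depth[y0 + dy][x0 + dx] = drow[dx]
    return primary, primary_depth
```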
  • the exemplary embodiments of the invention hereinabove were described with reference to a playback device for playback of information from an optical disc. It is noted that the source of the information is irrelevant; it may be provided locally on recorded media such as optical media, a hard disc or solid state memory, or it can be received via broadcasting over wired or wireless transmission systems, including the internet.
  • the invention relates to three-dimensional (3D) display of an information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • a method comprises reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream; scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream.
  • the invention enables overlaying 3D graphics onto 3D video without unwanted occlusion problems.
  • a computer program may be stored/distributed on a suitable medium, such as optical storage or supplied together with hardware parts, but may also be distributed in other forms, such as being distributed via the Internet or wired or wireless telecommunication systems.
  • in a system/device/apparatus claim enumerating several means, several of these means may be embodied by one and the same item of hardware or software. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Abstract

The invention relates to three-dimensional (3D) display of an information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream. A method according to the invention comprises reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream; scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range; and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream. The invention enables overlaying 3D graphics onto 3D video without unwanted occlusion problems.

Description

Playback and overlay of 3D graphics onto 3D video
FIELD OF THE INVENTION
The present invention relates to a method of playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream. The invention also relates to an apparatus for playback of the information stream as described herein-above and to a signal comprising the information stream as described herein-above.
BACKGROUND OF THE INVENTION
With the introduction of new 3D displays, there is an opportunity for 3D video to break through to the mass consumer market. Such 3D displays are able to handle both 3D display and 2D display. Various formats for 3D video exist today and most are based on single viewpoint stereo, whereby the user can see a scene in stereo from a single viewpoint. However, introducing 3D video does not only relate to introducing new displays capable of 3D display, but it also has impact on the whole content production and delivery chain. Firstly, the production of 3D video content is at an embryonic technology stage and various formats are proposed to be used each with their own advantages and disadvantages. With respect to content distribution, new coding methods were introduced for coding 3D content and new formats were proposed to include the 3D video stream in MPEG streams.
A known fact is that the introduction of new formats is usually slow, and a desired feature when introducing a new format is backwards playback compatibility with the installed player base. A missing area has been the carriage of 3D video content in a content distribution or publishing format such as Digital Video Broadcasting (DVB) or DVD and high-definition formats such as Blu-ray Disc (BD) or HD-DVD, while maintaining backwards compatibility with the installed player base. An important feature of high-definition publishing formats is the ability of content providers to provide multiple video streams, such as picture-in-picture, as well as graphics and interactive streams. For example, in the case of BD, DVD and HD-DVD it is known that such systems allow playback of video and graphics (e.g. subtitles, navigation buttons) at the same time. Usually graphics streams such as subtitles should always appear in front of the main video and are therefore added later to the final picture to be displayed.
However, when using 3D displays, the same composition order and overlaying constraints should hold, but this may not happen. In 3D displays, every pixel, belonging to either a video or a graphics stream, has a depth relative to the display. Such depth is either directly associated therewith if a 2D + depth coding of the 3D streams is used, or the depth information can be directly inferred from other coding systems, such as 2D + parallax information. Nothing prevents pixels of the main video from having a depth lower (i.e. being displayed closer to the viewer) than that of some pixels belonging to a graphics object such as the subtitles or an interactive button, causing occlusion of part of the graphics object by parts of the background video, whilst the intention of the author is that all of the graphics appear in front of the video background.
Figure 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems. In such systems there exists a main movie plane, a presentation plane comprising static graphics objects, and an interactive plane comprising interactive objects. The three planes are overlaid on each other: the main movie plane in the background, the presentation plane on top of the main movie and the interactive plane most forward. The right image in Fig. 1a indicates the output image with the three planes overlaid.
Figure 1b is a 2D representation of how such planes might intertwine in case of 3D display of each stream. Due to depth, some parts of the main movie plane may have a depth closer to the viewer than that of the graphics items. In such parts, the foreground graphics objects are punctured and text becomes difficult to read, while the general aspect of the displayed image is broken and unpleasing. In the case of graphics streams this is particularly problematic, as the graphics may appear at any location in the video and their location is dependent on input from the user.
SUMMARY OF THE INVENTION
It is an object of the invention to address the above-mentioned problems of displaying both 3D video streams and 3D graphics streams on top without occlusion problems. The object of the invention is reached by a method according to claim 1 for playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
According to the invention, the method comprises reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream; scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream. This is based on the insight that the occlusion problem between different planes is solved by segmenting the interval of possible depth values which can be displayed by a display into non-overlapping ranges and assigning them to the existing presentation planes, such as the video and graphics planes, followed by rescaling of the depth of each stream to its assigned range. Hence, according to the invention, the highest depth of a pixel of one plane is smaller than the lowest depth of a pixel in the next plane (going in the direction of increasing depth). It is noted that this concept is applicable to any playback system that displays at least two overlapping 3D graphics streams or non-moving streams of pictures, such as slideshows, or two overlapping video streams. Hence the invention is also applicable to displaying 3D secondary video on top of 3D video or to displaying rendered 3D graphics objects on top of 3D graphics backgrounds.
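The segmentation and rescaling steps described above can be sketched in a few lines of Python. This is only an illustrative sketch and not part of the claimed player model: the function name, the 8-bit depth scale and the use of plain lists are assumptions; only the convention that a lower depth value means closer to the viewer follows the description.

```python
def scale_depth_to_range(depth_map, range_near, range_far, source_max=255):
    """Linearly rescale per-pixel depth values into an assigned sub-range.

    depth_map: iterable of depth values in [0, source_max], where a lower
    value means closer to the viewer (the convention used in the text).
    range_near, range_far: limits of the non-overlapping sub-range assigned
    to this plane within the display's available depth range.
    """
    span = range_far - range_near
    return [range_near + (d / source_max) * span for d in depth_map]

# Assume an 8-bit available depth range [0, 255] on the target display.
graphics_depths = scale_depth_to_range([0, 128, 255], 0, 127)    # front range
video_depths = scale_depth_to_range([0, 128, 255], 128, 255)     # back range

# Every graphics pixel now lies in front of every video pixel, so the main
# video can no longer occlude (puncture) the graphics objects.
assert max(graphics_depths) < min(video_depths)
```

After scaling, the highest depth of the front plane is strictly smaller than the lowest depth of the back plane, which is exactly the non-overlap property stated above.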
In an advantageous embodiment of the invention, the information stream further comprises overlay information for overlaying the at least one graphics stream onto the main video stream, the overlay information comprising the non-overlapping depth ranges wherein the non-overlapping depth ranges are preferably defined as depth percentages of the available depth range. When expressed as depth percentages, the range limits could be defined while authoring the content, hence giving authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles). Furthermore, a non-absolute range indication (e.g. a percentage relative to the maximum depth value of the target screen) has the advantage of compatibility with any 3D display.
In a preferred embodiment, the information stream is BD compatible and comprises a video stream, a graphics stream and an interactive graphics stream, the interactive graphics stream being displayed in front of the graphics stream, which is displayed in front of the main video stream. In such cases, an optimal choice is for the depth ranges of the main movie stream, the graphics stream and the interactive graphics stream to be in the ratio 5:3:2.
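The preferred 5:3:2 split can be turned into concrete, contiguous sub-ranges as follows. This Python sketch is illustrative only; the function name and the 8-bit total depth value are assumptions.

```python
def partition_depth_range(total_depth, fractions):
    """Split the available depth range into contiguous non-overlapping sub-ranges.

    fractions: per-plane shares, ordered front (closest to viewer) to back,
    summing to 1.0. Returns (near, far) limits per plane; a lower depth
    value means closer to the viewer.
    """
    assert abs(sum(fractions) - 1.0) < 1e-9
    ranges, near = [], 0.0
    for f in fractions:
        far = near + f * total_depth
        ranges.append((near, far))
        near = far
    return ranges

# The preferred split, front to back: 20% interactive graphics plane, 30%
# presentation (graphics) plane, 50% main movie plane (the 5:3:2 ratio).
interactive, presentation, main_movie = partition_depth_range(255, [0.2, 0.3, 0.5])
```

Because the fractions are relative to the total, the same authored percentages map onto any target display's depth capability, which is the backwards-compatibility advantage noted above.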
The invention also relates to an apparatus for playback of an information stream suitable to be played back on a three-dimensional (3D) display as defined in claim 7 and to a signal as defined in claim 11.
These and other aspects of the invention are apparent from and will be explained with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
The features and advantages of the invention will be further explained upon reference to the following drawings, in which:
Fig. 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems, while Fig. 1b is a 2D representation of occlusion of the graphics stream by the video stream when both streams are displayed in 3D.
Fig. 2 illustrates schematically a playback device wherein the invention is practiced;
Fig. 3 illustrates schematically various presentation planes and the associated depth ranges according to an embodiment of the invention, while Fig. 4 illustrates schematically an embodiment according to the invention of the video processing unit and the rendering stage.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Fig. 2 illustrates schematically a playback device wherein the invention is practiced. It is noted that this describes a particular embodiment corresponding to playback from optical discs, but the source of the information stream is irrelevant: it may be provided locally on recorded media such as optical media, a hard disc or solid state memory, or it can be received by broadcasting via wired or wireless transmission systems, including the internet. The invention may be implemented in any device for playback of video information, including, among others, hard-disc recorders, set top boxes (STB) and digital (satellite/terrestrial/cable) receivers.
Optical discs have a track, the track being the position of the series of prerecorded marks representing information, arranged in accordance with a single spiral pattern constituting substantially parallel tracks on an information layer. The optical disc may comprise one or more information layers of a recordable type. Known examples of prerecorded optical discs are CD-ROM or DVD-ROM, or high density discs such as HD DVD-ROM or BD-ROM. For example, further details about the physical structure and addressing information for CD-ROM and DVD-ROM optical discs can be found in references ECMA-130 and ECMA-267 (ISO IEC 16449), respectively. In the case of BD systems, further details can be found in the publicly available technical white papers "Blu-ray Disc Format General August 2004" and "Blu-ray Disc 1.C Physical Format Specifications for BD-ROM November, 2005", published by the Blu-ray Disc Association (http://www.bluraydisc.com). The information is represented on the information layer by optically detectable marks along the track. The track 12 on the optical disc is indicated by a pre-embossed track structure provided during manufacture of the blank optical disc. The track structure is constituted, for example, by a pregroove, which enables a read/write head to follow the track during scanning.
The optical disc is intended for carrying user information according to a standardized format, to be playable on standardized playback devices. The recording format includes the way information is recorded, encoded and logically mapped onto the recording space provided by the track. The recordable space is usually subdivided into a lead-in area (LI) 31, a data zone (DZ) for recording the information and a lead-out area (LO). The lead-in area (LI) usually comprises basic disc management information and information on how to physically access the data zone (DZ). For example, said basic disc management information corresponds to the table of contents in CD systems or the formatting disc control blocks (FDCB) in DVD systems.
The user information recorded in the data zone (DZ) is further arranged according to an application format, for example comprising a predefined structure of files and directories.
Further, at logical level, the user information in the data zone is arranged according to a file system comprising file management information, such as ISO 9660 used in CD systems, available as ECMA-119, or UDF used in DVD systems, available as ECMA-167. The recording device is provided with scanning means for scanning the track of the optical disc, the scanning means comprising a drive unit 16 for rotating the optical disc 11, a head 18, a positioning unit 21 for coarsely positioning the head 18 in the radial direction on the track, and a control unit 17. The head 18 comprises an optical system of a known type for generating a radiation beam 20 guided through optical elements for focusing said radiation beam 20 to a radiation spot 19 on the track 12 of the optical disc 11. The radiation beam 20 is generated by a radiation source, e.g. a laser diode. The head further comprises (not shown) a focusing actuator for moving the focus of the radiation beam 20 along the optical axis of said beam and a tracking actuator for fine positioning of the radiation spot 19 in a radial direction on the center of the track. The tracking actuator may comprise coils for radially moving an optical element or may alternatively be arranged for changing the angle of a reflecting element.
For reading information, the radiation reflected by the information layer is detected by a detector of a usual type, e.g. a four-quadrant diode, in the head 18 for generating a read signal and further detector signals, such as a tracking error and a focusing error signal for controlling said tracking and focusing actuators. The control unit 17 controls the retrieving of information from the optical disc 11, and may be arranged for receiving commands from a user or from a host computer. To this end, the control unit 17 may comprise control circuitry, for example a microprocessor, a program memory and control gates, for performing the procedures described hereinafter. The control unit 17 may also be implemented as a state machine in logic circuits.
For reading, the read signal is processed by a read processing unit comprising a demodulator 26, a de-formatter 27 and an output unit 28 for processing the information and outputting said information to suitable means, such as a display or speakers. The functioning of the demodulator 26, the de-formatter 27 and the output unit 28 is controlled by the control unit 17. Hence, retrieving means for reading information include the drive unit 16, the head 18, the positioning unit 21 and the read processing unit. The demodulator 26 is responsible for demodulating a data signal from the channel signal, by using a suitable channel decoder, e.g. as disclosed in US 5,920,272 or US 5,477,222. The de-formatter 27 is responsible for using error correction codes and/or de-interleaving for extracting the information signal from the data signal. The output unit 28, under the control of the control unit 17, is responsible for processing the information signal at logical level. Furthermore, it is noted that the information signal may be arranged according to a playback format, which may prescribe that management information is associated with the audio-video information. Hence the output unit is responsible for separating management information from the audio-video information, and for de-multiplexing and/or decoding the audio and/or video information. Suitable compression/de-compression means are described for audio in WO 98/16014-A1 (PHN 16452), and for video in the MPEG2 standard (ISO-IEC 13818). The recording format in which the user information is recorded prescribes that management information for managing the recorded user information is also recorded onto the optical disc.
The video and audio information generated by the output unit 28 is sent to suitable means, such as a suitable display for the video information. A number of 3D displays are known, one of them being described in US 6,069,650. That display device comprises an LCD display comprising an actively switchable Liquid Crystal lenticular lens. Depending on the image content, a defined set of locations at the display can be switched to either 2D or 3D mode. For content providers that intend to make use of such display systems, such as movie studios, it is desirable to be able to distribute both 2D and 3D content on the same record medium in a format that is playback compatible with legacy players, i.e. players not enabled to display a 3D stream should still be able to handle the record medium.
With respect to a possible video application format to which the invention is applicable, a format also known as the BD video application format is described in the following white papers, incorporated herein by reference and available for download at www.bluraydisc.com:
- Technical White Paper: "Blu-ray Disc Format 2.A Logical and Audio Visual Application -Format Specifications for BD-RE August 2004";
- Technical White Paper: "Blu-ray Disc Format 2.B Logical and Audio Visual Application Format Specifications for BD-ROM March 2005";
- Technical White Paper: "Application Definition Blu-ray Disc Format BD-J Baseline Application and Logical Model Definition for BD-ROM March 2005".
In the BD system, four planes are defined for presentation: from the back, the Primary video plane, the Secondary video plane, the Presentation Graphics plane (PG plane) and the Interactive Graphics plane (IG plane); each plane is linked to the output of a dedicated decoder. On the Primary video plane, moving or still picture data from the Primary video decoder is presented. On the Secondary video plane, moving picture data from the Secondary video decoder is presented. On the Presentation Graphics plane, graphic data from either the Presentation Graphics decoder or the Text subtitle decoder is presented. The data on these planes are first overlaid to make interim video data. The transparency ratio between the two planes is defined as the alpha value in the CLUT of the Presentation Graphics plane. On the Interactive Graphics plane, graphic data from the Interactive Graphics decoder is presented and overlaid on the above interim video data to make the final video output. The transparency ratio between the two data is defined as the alpha value in the CLUT of the Interactive Graphics plane. It is noted that Figure 1 illustrates three of these planes; for clarity the secondary video plane was not shown.
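The two-stage alpha composition described above (presentation graphics over video yielding interim video data, then interactive graphics over the interim data) can be sketched as follows. The single-channel pixel values and the function name are illustrative assumptions, not BD-defined structures.

```python
def blend(back, front, alpha):
    """Overlay one plane onto another with per-pixel alpha (CLUT transparency).

    alpha = 1.0 means the front plane is fully opaque at that pixel,
    0.0 fully transparent. Values are single-channel intensities.
    """
    return [a * f + (1 - a) * b for b, f, a in zip(back, front, alpha)]

# Composition order from the text: video planes, then presentation graphics
# (making the interim video data), then interactive graphics (final output).
primary = [10, 10, 10]
pg      = [200, 200, 200]
interim = blend(primary, pg, [0.0, 0.5, 1.0])
# → [10.0, 105.0, 200.0]
```

A second `blend` call with the interactive graphics plane and its own alpha values would then produce the final video output in the same way.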
With respect to the coding of video information for 3D playback, which is not addressed by known BD systems, the multiple views necessary for a 3D display can be computed based on a 2D picture and an additional picture, a so-called depth map, as described in Oliver Schreer, "3D Videocommunication", Wiley, 2005, pages 29-34. The depth map conveys information about the depth of objects in the 2D image. The grey scale values in the depth map indicate the depth of the associated pixel in the 2D image. A stereo display can calculate the additional view required for stereo by using the depth value from the depth map and by calculating the required pixel transformation. With respect to a format for compressing and transmitting 3D video information, a solution is to make use of MPEG streams, wherein an MPEG 3D video stream would comprise a 2D video stream (as either a program stream or an elementary video transport stream) and, multiplexed with the 2D video stream, an auxiliary stream comprising additional information to enable 3D display (such as a depth map stream).
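The pixel transformation mentioned above can be illustrated with a simple horizontal-disparity warp over one image row. This is a hedged sketch only: a real renderer interpolates and handles occlusion and de-occlusion far more carefully, and the linear disparity mapping with an 8-bit depth scale (lower depth = nearer, as in the text's convention) is an assumption.

```python
def synthesize_view(image_row, depth_row, max_disparity):
    """Sketch of deriving a second stereo view from one 2D row plus its depth map.

    Pixels nearer to the viewer (lower depth value) receive a larger
    horizontal shift (disparity). This is an illustrative nearest-pixel
    warp; holes left by de-occlusion stay as None.
    """
    width = len(image_row)
    out = [None] * width
    for x, (pixel, depth) in enumerate(zip(image_row, depth_row)):
        disparity = round(max_disparity * (255 - depth) / 255)  # near → big shift
        tx = x + disparity
        if 0 <= tx < width:
            out[tx] = pixel
    return out
```

A row entirely at the far plane (depth 255) is copied unshifted, while nearer content shifts sideways, which is the parallax that gives the stereo impression.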
It is noted that although in the above 2D video + depth map was described as the preferred format for implementing the invention, it is not the only format that can be supported. For example the 2D video + depth map may be extended by adding background de-occlusion information and transparency information, or stereo + depth may be used as input format. Alternatively the multiple views may be used as input signal and mapped directly onto the display (sub)pixels. In an embodiment of the invention, in order to implement 3D display for each of the BD planes, for each plane it is possible to use the format 2D + depth as previously described, i.e. the full-resolution image is divided into four quadrants and one is used for the 2D content while another carries depth information. This is the format that each plane has before they are composed together into the final image that will be shown on the screen. The inventors had the insight that the occlusion problem between different planes is solved by segmenting the interval of possible depth values which can be displayed by a display into non-overlapping ranges and assigning them to the existing presentation planes, such as the video and graphics planes. Hence, according to the invention, the highest depth of a pixel of one plane is smaller than the lowest depth of a pixel in the next plane (going in the direction of increasing depth). It is noted that this concept is applicable to any playback system that displays at least two overlapping 3D (static) streams of pictures, such as slideshows, or two overlapping video streams. Hence the invention is also applicable to displaying 3D secondary video on top of 3D video or to displaying rendered 3D objects on top of 3D backgrounds.
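The quadrant layout mentioned above can be sketched as follows. Which quadrant carries the 2D content and which carries the depth map is an authoring choice, so the assignment in the comments below is an assumption for illustration only.

```python
def split_quadrants(frame):
    """Split a full-resolution frame (list of pixel rows) into four quadrants.

    In the quadrant layout described in the text, one quadrant carries the
    2D image of a plane and another its depth map; here we assume
    top-left = 2D content and top-right = depth map.
    """
    h, w = len(frame), len(frame[0])
    top_left = [row[:w // 2] for row in frame[:h // 2]]      # 2D content (assumed)
    top_right = [row[w // 2:] for row in frame[:h // 2]]     # depth map (assumed)
    bottom_left = [row[:w // 2] for row in frame[h // 2:]]
    bottom_right = [row[w // 2:] for row in frame[h // 2:]]
    return top_left, top_right, bottom_left, bottom_right
```

The remaining quadrants are available for extensions such as the background de-occlusion and transparency information mentioned above.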
The range limits could be defined while authoring the content, in order to give authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles). A non-absolute range indication (e.g. a percentage relative to the maximum depth value of the target screen) would be the best choice.
A method of playback, in a basic embodiment of the invention, wherein 3D objects are overlaid over a 3D video stream, comprises the steps of:
- reading and processing the two streams, including suitable processing such as demultiplexing;
- determining an available depth range for 3D display;
- attributing corresponding non-overlapping depth ranges to each of the main video stream and of the graphics stream;
- scaling the video depth information and the graphics depth information to the corresponding depth range;
- using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream in the rendering stage.
In a second embodiment of the invention, illustrated in Fig. 3, this is extended to three planes, such as a video plane and two graphics planes as used in BD systems. 35, 36 and 37 indicate the relative depth of each of the Main Movie plane, the Presentation plane and the Interactive plane. In Figure 3, depth is illustrated as increasing in the direction away from the viewer. In particular, in an improvement of the invention, the range limits could be defined while authoring the content, in order to give authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles). A non-absolute range indication (e.g. a percentage relative to the maximum depth value of the target screen) is the best choice. A preferred choice of ranges is 50% for the main movie plane, 30% for the presentation plane and 20% for the interactive plane.
A further problem addressed by the inventors is how to provide such a depth range choice, needed to avoid the occlusion problems, in a way that maintains backward compatibility with known systems. Within BD systems, it is known that three types of graphics segments exist:
- Object Definition Segments, which store the bitmap values of a certain graphics object;
- Palette Definition Segments, which provide the mapping between those values and real colours;
- Presentation and Interactive Composition Segments, which provide information on the way in which the current graphics element should be added to the graphics plane.
When implementing 3D objects, it is expected that two extra types of segments are used, namely a Depth Map Object Definition Segment and a Depth Map Palette Segment, along with the previous ones. According to an embodiment of the invention, a new data field, called depth_percentage, is added to the Composition Segment structure. Since Composition Segments also hold a reference to the Depth Map Palette Segment to be used with a certain graphics object, this allows rescaling that depth map according to the expressed percentage. Therefore this enables the association of a different portion of depth to different graphics planes, for example 20% of the whole depth to subtitles and 30% to interactive menus.
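The proposed depth_percentage field and the rescaling of the referenced Depth Map Palette can be sketched as follows. The Python structures below are illustrative stand-ins, not the actual BD segment syntax; all field and function names other than depth_percentage are assumptions.

```python
from dataclasses import dataclass


@dataclass
class DepthMapPalette:
    # Maps 8-bit palette entry indices to depth values; contents illustrative.
    entries: dict  # index -> depth value in [0, 255]


@dataclass
class CompositionSegment:
    palette_ref: DepthMapPalette
    depth_percentage: int  # proposed new field: share of total depth, percent


def rescale_palette(segment, range_near, total_depth=255):
    """Rescale the referenced depth map palette into the plane's assigned range.

    The palette's depth values are compressed so that they span only
    depth_percentage of the total depth, starting at range_near, which keeps
    the plane's pixels inside its non-overlapping depth range.
    """
    span = total_depth * segment.depth_percentage / 100
    return {idx: range_near + (d / 255) * span
            for idx, d in segment.palette_ref.entries.items()}
```

With a 20% subtitle plane, for instance, the full palette range collapses into one fifth of the display's depth span, anchored wherever that plane's range begins.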
According to an alternative embodiment of the invention, the same effect may be achieved if the depth_percentage field is included directly in the Depth Map Palette Segment definition.
With respect to the implementation of the invention in a playback device, the following are noted:
The information stream processed by the output unit 28 is provided to a video processing unit 31, responsible for implementing the known functions of the player model, such as buffering, demultiplexing, processing each elementary stream and executing received commands. The video processing unit 31 is usually implemented as a combination of software and hardware. The processed video information stream is provided to a rendering unit, which is responsible for processing the video information into a signal suitable for 3D display. It is noted that the rendering unit may also be implemented in the 3D display itself. Furthermore, it is known that the functions of the control unit 17, of the video processing unit 31 and of the rendering stage 32 may be implemented in a device by the same hardware and/or software block.
According to the invention, the control unit 17 is adapted to determine an available depth range for the 3D display, to attribute corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream and to scale each of the video depth information and the at least one graphics depth information to the corresponding depth range, while the rendering unit is adapted to use the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) rendering of the information stream. In a specific embodiment, depth determination means 29 are provided in the control unit, preferably implemented as firmware or as embedded software.
According to the invention, two additional functions need to be performed within the functional block comprising the video processing unit 31 and the rendering stage 32, a specific embodiment of which is illustrated in more detail in Fig. 4. The received stream is demultiplexed and buffered in unit 41, the graphics stream being sent to the stream graphics processor 42. For generating the graphic images to be displayed for each of the right and left views, two buffers 43 and 44 are provided under the control of a graphics controller. The two buffers 43 and 44 supply the two graphics plane processors 45 and 46.
According to the invention, both graphics decoders (43, 45 and, respectively, 44, 46) are adapted to take into account the value of depth_percentage present in the Composition Segments. The depth map palette has to be adapted to the depth_percentage value, i.e. the minimum and maximum depth contained in the palette have to be within that percentage of the whole possible depth values.
This guarantees that Presentation and Interactive planes will have a depth according to the corresponding depth_percentage values. When overlaying all the planes together, the apparatus ensures that the depth of the video planes does not exceed their percentage (i.e. the total possible depth span minus the sum of the percentage of the two graphics planes).
Furthermore, it is noted that BD and HD-DVD also support a secondary video plane for picture-in-picture (PIP). For PIP, the secondary video may appear side by side with the main video or in a quarter of the screen. The primary video cannot be scaled, so the secondary video always covers part of the primary video. Another known constraint is that the secondary video is either fully transparent or fully opaque. So it is advantageous that, when the secondary video is fully opaque, the primary video does not punch through the other video during rendering of the 2D + depth information in the display, in a similar way as what can happen with the graphics.
According to the invention, an alternative solution is possible in the case of overlaying two video streams. According to the invention, in the player the primary video and the secondary video are combined both for the 2D information and for the depth information. The data from the secondary video plane simply overwrites the pixels of the primary video on the same presentation plane completely. Such a solution is possible in view of the fact that there is no semi-transparency and there is no strong requirement for occlusion information of the primary video that is concealed by the secondary video. This is in contrast to overlaying a graphics stream, as the graphics stream may be semi-transparent and may overlay the video stream in any shape and location.
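The overwrite-based combination of primary and secondary video described above can be sketched as follows. The nested-list frame representation and the function name are illustrative assumptions.

```python
def overlay_secondary(primary_2d, primary_depth, secondary_2d, secondary_depth,
                      x, y):
    """Overwrite primary-video pixels (image AND depth) with the secondary video.

    Because the secondary (PIP) video is fully opaque, its pixels simply
    replace the primary's at position (x, y): no alpha blending and no
    per-pixel depth comparison are needed, unlike for graphics overlays.
    """
    for dy, (row_2d, row_d) in enumerate(zip(secondary_2d, secondary_depth)):
        for dx, (p, d) in enumerate(zip(row_2d, row_d)):
            primary_2d[y + dy][x + dx] = p
            primary_depth[y + dy][x + dx] = d
    return primary_2d, primary_depth
```

Since both the 2D samples and the depth samples are overwritten together, the combined plane can then be rendered as a single 2D + depth stream.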
Finally it is noted that exemplary embodiments of the invention hereinabove were done with reference to a playback device for playback of information from an optical disc. It is noted that the source of the information is irrelevant, it may be provided locally on a recorded media such as optical media, hard disc or solid state memory, or it can be received with broadcasting via wired or wireless transmission systems, including the internet. The invention may be implemented in any device for playback of video information, including, among others, hard-disc recorders, set top boxes (STB) and digital (satellite/terrestrial/cable) receivers.
This invention can be summarized as follows: the invention relates to three-dimensional (3D) display of an information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream. A method according to the invention comprises reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream; scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream. The invention enables overlaying 3D graphics onto 3D video without unwanted occlusion problems.
It should be noted that the above-mentioned embodiments are meant to illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verbs "comprise" and "include" and their conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. A computer program may be stored/distributed on a suitable medium, such as optical storage, or supplied together with hardware parts, but may also be distributed in other forms, such as via the Internet or wired or wireless telecommunication systems. In a system/device/apparatus claim enumerating several means, several of these means may be embodied by one and the same item of hardware or software. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. A method of playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising
- a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream;
- at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream; the method comprising
- reading or receiving the information stream;
- determining an available depth range for 3D display;
- attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream;
- scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range;
- using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream.
2. A method according to claim 1, wherein the information stream further comprises overlay information for overlaying the at least one graphics stream onto the main video stream, the overlay information comprising the non-overlapping depth ranges.
3. A method according to claim 1 or 2, wherein the non-overlapping depth ranges are defined as depth percentages of the available depth range.
4. A method according to claim 2 or 3, wherein the information stream comprises a graphics stream and an interactive graphics stream.
5. A method according to claim 4, wherein the depth ranges are chosen such that the interactive graphics stream is displayed in front of the graphics stream, which is displayed in front of the main video stream.
6. A method according to claim 5, wherein the depth ranges attributed to the main video stream, the graphics stream and the interactive graphics stream are in the ratio 5:3:2.
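The partitioning and scaling steps recited in claims 1 and 6 can be sketched as follows. This is an illustrative sketch only, not part of the patent: the function names, the 0-255 depth convention, and the use of linear scaling are all assumptions not stated in the claims.

```python
def partition_depth_range(depth_min, depth_max, ratios):
    """Split [depth_min, depth_max] into contiguous, non-overlapping
    sub-ranges whose sizes are proportional to the given ratios."""
    total = sum(ratios)
    spans = []
    start = depth_min
    for r in ratios:
        end = start + (depth_max - depth_min) * r / total
        spans.append((start, end))
        start = end
    return spans

def scale_depth(depth_values, src_range, dst_range):
    """Linearly remap each depth value from src_range into dst_range."""
    s0, s1 = src_range
    d0, d1 = dst_range
    return [d0 + (v - s0) * (d1 - d0) / (s1 - s0) for v in depth_values]

# Assume an available display depth range of 0..255 and the 5:3:2 ratio
# of claim 6 (main video : graphics : interactive graphics), ordered so
# the main video sits deepest and interactive graphics in front.
video_rng, gfx_rng, igfx_rng = partition_depth_range(0, 255, [5, 3, 2])

# A stream's full-range (0..255) depth information is then squeezed
# into the non-overlapping sub-range attributed to that stream.
video_depth = scale_depth([0, 128, 255], (0, 255), video_rng)
```

With these assumptions, the main video occupies depths 0 to 127.5, the graphics stream 127.5 to 204, and the interactive graphics stream 204 to 255, so the three streams can never interpenetrate regardless of their original depth values.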
7. An apparatus for playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising
- a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream;
- at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream; the apparatus comprising
- an input unit for reading or receiving the information stream;
- a control unit for determining an available depth range for 3D display, for attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream, and for scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range;
- a rendering unit adapted to use the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) rendering of the information stream.
8. An apparatus according to claim 7, wherein the control unit is enabled to define the non-overlapping depth ranges as depth percentages of the available depth range.
9. An apparatus according to claim 8, wherein the information stream comprises a graphics stream and an interactive graphics stream.
10. An apparatus according to claim 9, wherein the control unit is enabled to choose the depth ranges such that the interactive graphics stream is displayed in front of the graphics stream, which is displayed in front of the main video stream.
11. A signal comprising an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising
- a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream;
- at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream;
the information stream further comprising overlay information for overlaying the at least one graphics stream onto the main video stream, the overlay information comprising non-overlapping depth ranges associated with each of the main video stream and of the at least one graphics stream.
12. A signal according to claim 11, wherein the non-overlapping depth ranges are defined as depth percentages.
13. A record carrier having recorded thereon a signal according to claim 11 or 12.
PCT/IB2008/055338 2007-12-20 2008-12-16 Playback and overlay of 3d graphics onto 3d video WO2009083863A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP07123796 2007-12-20
EP07123796.0 2007-12-20

Publications (1)

Publication Number Publication Date
WO2009083863A1 true WO2009083863A1 (en) 2009-07-09

Family

ID=40394513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/055338 WO2009083863A1 (en) 2007-12-20 2008-12-16 Playback and overlay of 3d graphics onto 3d video

Country Status (2)

Country Link
TW (1) TW200935873A (en)
WO (1) WO2009083863A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010058362A1 (en) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Extending 2d graphics in a 3d gui
WO2010058368A1 (en) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
EP2309463A2 (en) 2009-10-07 2011-04-13 Thomson Licensing Method of displaying a 3D video with insertion of a graphic item and terminal for implementing the method
EP2309765A1 (en) * 2009-09-11 2011-04-13 Disney Enterprises, Inc. System and method for three-dimensional video capture workflow for dynamic rendering
EP2320667A1 (en) * 2009-10-20 2011-05-11 Koninklijke Philips Electronics N.V. Combining 3D video auxiliary data
EP2337368A1 (en) * 2009-08-18 2011-06-22 Sony Corporation Reproducing device, reproducing method, data structure, recording medium, recording device, recording method, and program
CN102164257A (en) * 2010-02-05 2011-08-24 Lg电子株式会社 An electronic device and a method for providing a graphical user interface (gui) for broadcast information
EP2467831A2 (en) * 2009-08-17 2012-06-27 Samsung Electronics Co., Ltd. Method and apparatus for processing signal for three-dimensional reproduction of additional data
EP2495979A1 (en) 2011-03-01 2012-09-05 Thomson Licensing Method, reproduction apparatus and system for display of stereoscopic 3D video information
EP2502424A2 (en) * 2009-11-16 2012-09-26 LG Electronics Inc. Image display apparatus and operating method thereof
EP2525580A3 (en) * 2011-05-20 2013-05-15 EchoStar Technologies L.L.C. Dynamically configurable 3D display
CN103155577A (en) * 2010-10-01 2013-06-12 三星电子株式会社 Display device, signal-processing device, and methods therefor
EP2312859A3 (en) * 2009-10-13 2013-06-26 Broadcom Corporation Method and system for communicating 3D video via a wireless communication link
WO2013120742A1 (en) * 2012-02-13 2013-08-22 Thomson Licensing Method and device for inserting a 3d graphics animation in a 3d stereo content
EP2630803A2 (en) * 2010-10-18 2013-08-28 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
US8786673B2 (en) 2011-01-07 2014-07-22 Cyberlink Corp. Systems and methods for performing video conversion based on non-linear stretch information
EP2453661A4 (en) * 2009-07-10 2016-03-30 Panasonic Ip Man Co Ltd Recording medium, reproducing device, and integrated circuit
US9600923B2 (en) 2011-05-26 2017-03-21 Thomson Licensing Scale-independent maps
US9699438B2 (en) 2010-07-02 2017-07-04 Disney Enterprises, Inc. 3D graphic insertion for live action stereoscopic video
US10791314B2 (en) 2010-03-31 2020-09-29 Interdigital Ce Patent Holdings, Sas 3D disparity maps

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI462567B (en) 2010-06-18 2014-11-21 Realtek Semiconductor Corp Three dimensional processing circuit and processing method
IT1401367B1 (en) 2010-07-28 2013-07-18 Sisvel Technology Srl METHOD TO COMBINE REFERENCE IMAGES TO A THREE-DIMENSIONAL CONTENT.
US9571811B2 (en) 2010-07-28 2017-02-14 S.I.Sv.El. Societa' Italiana Per Lo Sviluppo Dell'elettronica S.P.A. Method and device for multiplexing and demultiplexing composite images relating to a three-dimensional content
US20150245063A1 (en) * 2012-10-09 2015-08-27 Nokia Technologies Oy Method and apparatus for video coding
TWI510071B (en) * 2013-09-18 2015-11-21 Vivotek Inc Pre-processing method for video data playback and playback interface apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0905988A1 (en) * 1997-09-30 1999-03-31 Kabushiki Kaisha Toshiba Three-dimensional image display apparatus
JP2004274125A (en) * 2003-03-05 2004-09-30 Sony Corp Image processing apparatus and method
WO2008038205A2 (en) * 2006-09-28 2008-04-03 Koninklijke Philips Electronics N.V. 3 menu display
WO2008115222A1 (en) * 2007-03-16 2008-09-25 Thomson Licensing System and method for combining text with three-dimensional content


Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010058368A1 (en) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
WO2010058362A1 (en) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Extending 2d graphics in a 3d gui
EP2453661A4 (en) * 2009-07-10 2016-03-30 Panasonic Ip Man Co Ltd Recording medium, reproducing device, and integrated circuit
EP2467831A4 (en) * 2009-08-17 2013-04-17 Samsung Electronics Co Ltd Method and apparatus for processing signal for three-dimensional reproduction of additional data
EP2467831A2 (en) * 2009-08-17 2012-06-27 Samsung Electronics Co., Ltd. Method and apparatus for processing signal for three-dimensional reproduction of additional data
CN103024412A (en) * 2009-08-18 2013-04-03 索尼公司 Reproducing apparatus reproducing method, recording apparatus and recording method
US8488950B2 (en) 2009-08-18 2013-07-16 Sony Corporation Reproducing apparatus and reproducing method, data structure, recording medium, recording apparatus and recording method, and program
EP2337368A4 (en) * 2009-08-18 2013-06-12 Sony Corp Reproducing device, reproducing method, data structure, recording medium, recording device, recording method, and program
EP2337368A1 (en) * 2009-08-18 2011-06-22 Sony Corporation Reproducing device, reproducing method, data structure, recording medium, recording device, recording method, and program
CN103024412B (en) * 2009-08-18 2015-10-28 索尼公司 Reproducer and reproducting method and recording equipment and recording method
EP2309765A1 (en) * 2009-09-11 2011-04-13 Disney Enterprises, Inc. System and method for three-dimensional video capture workflow for dynamic rendering
US8614737B2 (en) 2009-09-11 2013-12-24 Disney Enterprises, Inc. System and method for three-dimensional video capture workflow for dynamic rendering
WO2011042479A1 (en) 2009-10-07 2011-04-14 Thomson Licensing Method of displaying a 3d video with insertion of a graphic item and terminal for implementing the method
EP2309463A3 (en) * 2009-10-07 2011-07-27 Thomson Licensing Method of displaying a 3D video with insertion of a graphic item and terminal for implementing the method
EP2309463A2 (en) 2009-10-07 2011-04-13 Thomson Licensing Method of displaying a 3D video with insertion of a graphic item and terminal for implementing the method
EP2312859A3 (en) * 2009-10-13 2013-06-26 Broadcom Corporation Method and system for communicating 3D video via a wireless communication link
EP2320667A1 (en) * 2009-10-20 2011-05-11 Koninklijke Philips Electronics N.V. Combining 3D video auxiliary data
EP2502424A2 (en) * 2009-11-16 2012-09-26 LG Electronics Inc. Image display apparatus and operating method thereof
EP2502424A4 (en) * 2009-11-16 2014-08-27 Lg Electronics Inc Image display apparatus and operating method thereof
EP2355495A3 (en) * 2010-02-05 2012-05-30 Lg Electronics Inc. An electronic device and a method for providing a graphical user interface (gui) for broadcast information
CN102164257A (en) * 2010-02-05 2011-08-24 Lg电子株式会社 An electronic device and a method for providing a graphical user interface (gui) for broadcast information
US10791314B2 (en) 2010-03-31 2020-09-29 Interdigital Ce Patent Holdings, Sas 3D disparity maps
US9699438B2 (en) 2010-07-02 2017-07-04 Disney Enterprises, Inc. 3D graphic insertion for live action stereoscopic video
EP2624571A2 (en) * 2010-10-01 2013-08-07 Samsung Electronics Co., Ltd Display device, signal-processing device, and methods therefor
EP2624571A4 (en) * 2010-10-01 2014-06-04 Samsung Electronics Co Ltd Display device, signal-processing device, and methods therefor
CN103155577A (en) * 2010-10-01 2013-06-12 三星电子株式会社 Display device, signal-processing device, and methods therefor
EP2630803A2 (en) * 2010-10-18 2013-08-28 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
EP2630803A4 (en) * 2010-10-18 2014-10-08 Silicon Image Inc Combining video data streams of differing dimensionality for concurrent display
US8786673B2 (en) 2011-01-07 2014-07-22 Cyberlink Corp. Systems and methods for performing video conversion based on non-linear stretch information
US9547928B2 (en) 2011-03-01 2017-01-17 Thomson Licensing Method and apparatus for authoring stereoscopic 3D video information, and method and apparatus for displaying such stereoscopic 3D video information
EP2495979A1 (en) 2011-03-01 2012-09-05 Thomson Licensing Method, reproduction apparatus and system for display of stereoscopic 3D video information
WO2012116900A1 (en) 2011-03-01 2012-09-07 Thomson Licensing Method and apparatus for authoring stereoscopic 3d video information, and method and apparatus for displaying such stereoscopic 3d video information
US8923686B2 (en) 2011-05-20 2014-12-30 Echostar Technologies L.L.C. Dynamically configurable 3D display
EP2525580A3 (en) * 2011-05-20 2013-05-15 EchoStar Technologies L.L.C. Dynamically configurable 3D display
US9600923B2 (en) 2011-05-26 2017-03-21 Thomson Licensing Scale-independent maps
US9685006B2 (en) 2012-02-13 2017-06-20 Thomson Licensing Dtv Method and device for inserting a 3D graphics animation in a 3D stereo content
WO2013120742A1 (en) * 2012-02-13 2013-08-22 Thomson Licensing Method and device for inserting a 3d graphics animation in a 3d stereo content

Also Published As

Publication number Publication date
TW200935873A (en) 2009-08-16

Similar Documents

Publication Publication Date Title
WO2009083863A1 (en) Playback and overlay of 3d graphics onto 3d video
US9338428B2 (en) 3D mode selection mechanism for video playback
US11277600B2 (en) Switching between 3D video and 2D video
JP5859309B2 (en) Combination of 3D video and auxiliary data
RU2520325C2 (en) Data medium and reproducing device for reproducing 3d images
RU2522304C2 (en) Reproducing device, recording method, recording medium reproducing system
CA2691727C (en) Recording medium, playback device, system lsi, playback method, glasses, and display device for 3d images
KR20070014963A (en) Recording medium, method and appratus for reproducing data and method and apparatus for recording data
KR101596832B1 (en) / / recording medium data recording/reproducing method and data recording/reproducing apparatus
EP2320667A1 (en) Combining 3D video auxiliary data
KR101537615B1 (en) Recording medium, data recording/reproducing method, and data recording/reproducing apparatus
KR101648450B1 (en) Data reproducing method, and reproducing apparatus
KR20080033404A (en) Recording medium, method and apparatus for reproducing data and method and apparatus for recording data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08867777

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08867777

Country of ref document: EP

Kind code of ref document: A1