US20140126886A1 - Synchronized stream packing - Google Patents
- Publication number
- US20140126886A1 (application US14/155,969)
- Authority: US (United States)
- Prior art keywords: audio, video, packets, picture, sub
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N9/87—Regeneration of colour television signals
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating discs
- G11B27/3027—Timing or synchronising by using digitally coded information signals recorded on the same track as the main recording
- H04N21/23424—Splicing one content stream with another, e.g. for inserting or substituting an advertisement
- H04N21/234318—Reformatting of video signals by decomposing into objects, e.g. MPEG-4 objects
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
- H04N21/23614—Multiplexing of additional data and video streams
- H04N21/2365—Multiplexing of several video streams
- H04N21/2368—Multiplexing of audio and video streams
- H04N21/2381—Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
- H04N21/43072—Synchronising the rendering of multiple content streams on the same device
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams
- H04N21/4341—Demultiplexing of audio and video streams
- H04N21/4347—Demultiplexing of several video streams
- H04N21/4348—Demultiplexing of additional data and video streams
- H04N21/435—Processing of additional data, e.g. decrypting of additional data
- H04N21/4363—Adapting the video or multiplex stream to a specific local network, e.g. an IEEE 1394 or Bluetooth® network
- H04N21/4381—Recovering the multiplex stream from a specific network, e.g. recovering MPEG packets from ATM cells
- H04N21/44016—Splicing one content stream with another, e.g. for substituting a video clip
- H04N21/440209—Reformatting of video signals for formatting on an optical medium, e.g. DVD
- H04N21/4884—Data services, e.g. news ticker, for displaying subtitles
- H04N21/8547—Content authoring involving timestamps for synchronizing content
- H04N5/04—Synchronising
- G11B2220/2562—DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
Definitions
- the present invention relates generally to Digital Versatile Discs, previously known as Digital Video Discs (DVDs), High Definition Digital Versatile Discs (HD DVD), and Blu-Ray Disc (BD), and more particularly to a technique for facilitating synchronization among the sub-streams of different audio/visual (A/V) streams embedded on a DVD, HD DVD, or BD.
- the DVD, HD DVD and Blu-ray specifications currently define mechanisms for seamlessly switching among multiple parallel A/V streams.
- the audio and sub-picture content of the streams is restricted to be bit-for-bit identical among all of the streams. This prevents any potential damage to audio speakers that could result from signal spikes caused by differences in the audio data from one A/V stream to another, and also reduces the restrictions regarding organization of such data within each multiplexed A/V stream.
- these restrictions also greatly limit the range of applications for which the seamless multi-angle feature may be used.
- Present-day DVDs, HD DVDs, and BDs typically include at least one, and usually several, A/V streams in parallel synchronism with each other. Often such A/V streams comprise different recordings of the same scene shot from different angles; hence, such different A/V streams are often referred to as "angles". Selection among angles (i.e., among streams) occurs through a process known as "multi-angle navigation" whereby a viewer selects a desired angle by selecting an associated icon on a display screen.
- The DVD, HD DVD, and BD specifications adopted by the manufacturers of these discs and their associated playback devices define a process known as "multi-angle video" whereby a content author can define as many as nine concurrent A/V streams, any one of which can appear on a display screen at any time.
- the viewer can switch seamlessly among a set of synchronized A/V streams by actuating a command via a button on a DVD, HD DVD, or BD player or on the remote control device for such player; this form of multi-angle navigation is known as seamless multi-angle.
- In seamless multi-angle navigation, the audio and sub-picture data stored in each A/V stream must remain identical; only the video data is allowed to differ between angles.
- Sub-picture data describes the rendering of buttons, subtitles, and other graphical elements displayed over video. This results both in an inability to automatically present different audio and sub-picture content when a parallel A/V stream is selected and also leads to redundant copies of audio and sub-picture data being stored on the delivery medium, limiting space for other content.
- A/V streams are constituted, at a basic level, of data packets for the sub-streams (audio, video, and sub-picture), which are joined together in short units that, when read sequentially, comprise the presented stream.
- In DVD, these fundamental data units are known as Video Object Units (VOBUs), each including about 0.4 to 1 second of presentation data; HD DVD uses the analogous Enhanced Video Object Units (EVOBUs). The terms VOBUs and EVOBUs may be used interchangeably herein for illustrative purposes.
- Each stream collects one or more VOBUs into an Interleave Unit (ILVU), and these ILVUs are synchronized with the ILVUs of the other parallel A/V streams based on the video presentation time.
- When a stream change is requested, the data from the current ILVU plays until the end of that ILVU, and the ILVU for the new stream is presented seamlessly at that time. In this way, seamless presentation of video is assured.
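The boundary-aligned switching just described can be sketched as follows. This is a minimal illustration, not the patent's implementation; all type and function names are hypothetical, and ILVUs are modeled only by their angle and their video presentation-time interval.

```python
# Sketch of ILVU-boundary angle switching: a switch request takes effect
# only when the current ILVU finishes, and ILVUs for all angles are
# aligned on video presentation time. All names are illustrative.

from dataclasses import dataclass

@dataclass(frozen=True)
class ILVU:
    angle: int        # which parallel A/V stream (angle) this ILVU belongs to
    start_pts: float  # video presentation start time, in seconds
    end_pts: float    # video presentation end time, in seconds

def next_ilvu(ilvus_by_angle, current, requested_angle):
    """Return the ILVU to present after `current`, honoring a pending
    angle-switch request: the aligned ILVU of the requested angle that
    begins exactly where the current ILVU ends."""
    for candidate in ilvus_by_angle[requested_angle]:
        if candidate.start_pts == current.end_pts:
            return candidate
    raise LookupError("no time-aligned ILVU for the requested angle")

# Two angles with ILVUs aligned on the same presentation-time grid.
ilvus = {
    0: [ILVU(0, 0.0, 0.5), ILVU(0, 0.5, 1.0)],
    1: [ILVU(1, 0.0, 0.5), ILVU(1, 0.5, 1.0)],
}
# Viewer requests angle 1 while the first ILVU of angle 0 is playing:
switched = next_ilvu(ilvus, ILVU(0, 0.0, 0.5), requested_angle=1)
# `switched` is the angle-1 ILVU covering the next interval, 0.5-1.0 s.
```

Because every angle's ILVUs cover the same presentation-time grid, the lookup always lands on a unit that continues the video without a gap, which is what makes the switch seamless.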
- BD refers to a similar combination of packets using different terminology, namely Transport Stream (TS).
- BD does not limit the duration of presentation data in the unit; instead of ILVUs, it uses angle change points in each TS to mark the points at which streams can be changed while ensuring video continuity.
- Audio, video, and sub-picture packets in VOBUs, TS, RTP or other packetized multimedia formats are all typically marked with a first type of timestamp indicating when they should be delivered for decoding and a second type of timestamp indicating when they should be presented.
- The delivery timestamps are encoded in the "system_clock_reference" as defined in ISO/IEC 13818-1.
- Delivery timestamps are typically called "arrival_timestamps" in some of the specifications derived from ISO/IEC 13818-1.
- Herein, "arrival timestamp" collectively refers to the delivery timestamp in VOBUs and TSs.
- The presentation timestamps are the usual PTSs as defined in ISO/IEC 13818-1.
- Non-video packets in a single VOBU may not all refer to similar presentation times. For example, an audio packet may refer to presentation time 8 while a video packet refers to presentation time 4, the audio packet for presentation time 4 having been delivered from a previous VOBU.
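The delivery-versus-presentation mismatch above can be made concrete with a small sketch. The field names and numeric values below are illustrative only, chosen to mirror the example in the text.

```python
# A VOBU whose audio packet carries a later presentation time than its
# video packet: the audio matching the video's presentation time was
# already delivered in an earlier VOBU. Field names are illustrative.

from collections import namedtuple

Packet = namedtuple("Packet", ["kind", "arrival", "pts"])

vobu = [
    Packet("video", arrival=100, pts=4),  # video for presentation time 4
    Packet("audio", arrival=101, pts=8),  # audio delivered early, for time 8
]

def earliest_audio_pts(packets):
    """Smallest audio presentation timestamp carried in this unit,
    or None if the unit carries no audio packets."""
    ptss = [p.pts for p in packets if p.kind == "audio"]
    return min(ptss) if ptss else None

# The audio for video PTS 4 is absent from this VOBU; it arrived in a
# previous one, so switching streams at this VOBU cannot replace it.
```

This is exactly why switching between contextually different streams can leave already-delivered audio or sub-picture data out of correspondence with the new video.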
- When audio and sub-picture/subtitle data are identical between the VOBUs in ILVUs (or between the TSs) of different A/V streams in a parallel presentation, switching ILVUs or TSs has no effect on audio, sub-picture/subtitle, and video correspondence or synchronization.
- When the audio and sub-picture data packets differ between the VOBUs or TSs of different A/V streams, however, a case could occur where audio or sub-picture/subtitle packets corresponding to the presentation time of the video from the new VOBU or TS have already been delivered from a previous VOBU or TS, resulting in audio or sub-picture/subtitle content that, while presented at the proper time, is out of correspondence with the current context.
- a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation includes the step of identifying sub-picture/subtitle packets and/or audio packets having arrival timestamps and/or presentation timestamps that match an arrival timestamp and/or a presentation timestamp, respectively, of video packets.
- the method also includes the step of packing a Video Object Unit (VOBU) and/or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching timestamps.
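The packing step of the method above can be sketched as follows. This is a simplified illustration under assumed names (`pack_unit`, a minimal `Packet` tuple), not the patent's actual implementation: it admits into a unit only the audio and sub-picture packets whose presentation timestamps match a video packet in that unit.

```python
# Sketch of synchronized stream packing: gather the video packets for a
# VOBU/TS-like unit, then include only the audio and sub-picture packets
# whose presentation timestamps match the video's. Names are illustrative.

from collections import namedtuple

Packet = namedtuple("Packet", ["kind", "pts"])

def pack_unit(video_packets, other_packets):
    """Build a unit from the video packets plus those audio and
    sub-picture packets whose PTSs match a video PTS in this unit."""
    video_ptss = {p.pts for p in video_packets}
    matched = [p for p in other_packets
               if p.kind in ("audio", "subpicture") and p.pts in video_ptss]
    return list(video_packets) + matched

video = [Packet("video", 4), Packet("video", 5)]
others = [Packet("audio", 4), Packet("audio", 8), Packet("subpicture", 5)]
unit = pack_unit(video, others)
# The audio packet for PTS 8 is excluded; it belongs in a later unit.
```

Packing this way keeps each unit's non-video packets in step with its video, so a stream switch at a unit boundary cannot leave stale audio or sub-picture data behind.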
- an apparatus for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation includes means for identifying sub-picture/subtitle packets and/or audio packets having arrival timestamps and/or presentation timestamps that match an arrival timestamp and/or a presentation timestamp, respectively, of video packets.
- the apparatus also includes means for packing a Video Object Unit (VOBU) and/or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching timestamps.
- A method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation includes the step of packing an audio frame header into an audio packet at a beginning of a first Video Object Unit (VOBU) in an InterLeaVe Unit (ILVU), or at an angle change point marker in a Transport Stream (TS).
- The method also includes the step of packing a last audio packet (in a last VOBU in the ILVU, or in another ILVU in the same one of the plurality of A/V streams, or immediately prior to another angle change point marker in the TS) so as to conclude with a complete audio frame.
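The frame-alignment constraint above can be sketched as a simple validity check. The fixed frame size and two-byte sync pattern below are assumptions for illustration (real audio codecs have their own frame sizes and header formats); the point is only that the audio span between switch points should begin at a frame header and hold whole frames.

```python
# Sketch: the audio payload between switch points (ILVU boundaries, or TS
# angle change points) should begin with a frame header and end on a
# complete frame, so a stream change never lands mid-frame.
# The frame size and sync pattern are assumed values for illustration.

FRAME_BYTES = 192         # assumed constant audio frame size, in bytes
SYNC = b"\xff\xf0"        # assumed two-byte frame-header sync pattern

def is_switch_safe(audio_payload: bytes) -> bool:
    """True if the payload starts at a frame header and contains only
    whole frames, i.e. it is safe to switch streams at its boundaries."""
    return (len(audio_payload) > 0
            and audio_payload[:2] == SYNC
            and len(audio_payload) % FRAME_BYTES == 0)

aligned = SYNC + bytes(FRAME_BYTES - 2)      # exactly one complete frame
truncated = SYNC + bytes(FRAME_BYTES - 10)   # ends mid-frame
```

A decoder entering a new stream at a boundary that satisfies this check can begin decoding audio immediately, with no partial frame to discard or conceal.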
- FIG. 1 is a block diagram illustrating a DVD player to which the present invention may be applied, in accordance with an illustrative embodiment thereof;
- FIG. 2 is a flow diagram illustrating a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation in accordance with the present principles
- FIG. 3 is a flow diagram illustrating a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation in accordance with the present principles
- FIG. 4 is a flow diagram illustrating a method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation in accordance with the present principles
- FIG. 5 is a block diagram illustrating the relationship among an audio/visual stream, Video Object Units (VOBUs), and Interleave Units (ILVUs).
- the present invention is directed to synchronized stream packing.
- a method is provided for constraining the organization of audio and sub-picture packets within multiplexed streams (e.g., MPEG program and transport streams) in order to allow seamless switching among multiple interleaved audio/video (A/V) presentations in which the audio content and/or sub-picture/subtitle content is different.
- processor or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), and non-volatile storage.
- any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
- the invention as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
- A Digital Versatile Disc (DVD) player to which the present invention may be applied is indicated generally by the reference numeral 10 in FIG. 1.
- the DVD player 10 includes a drive motor 12 that rotates a DVD 13 under the control of a servomechanism 14 .
- a pick-up head motor 16 also controlled by the servomechanism 14 , serves to displace an optical pick-up head 18 across the DVD 13 to read information carried thereby.
- a pre-amplifier 20 amplifies the output signal of the pick-up head 18 for input to a decoder 22 that decodes the optical information read from the DVD 13 to yield a program stream.
- a de-multiplexer 24 de-multiplexes the program stream into separate components: (a) an audio stream; (b) a video stream; (c) a sub-picture stream; and (d) navigation information, typically in the form of metadata or the like.
- the audio, video, and sub-picture streams undergo decoding by a separate one of the audio decoder 26 , video decoder 28 and sub-picture decoder 30 , respectively.
- A synchronizer 32, sometimes known as a presentation engine, serves to synchronize and combine the separately decoded audio, video, and sub-picture streams into a video stream with embedded audio, suitable for reproduction in accordance with one of several known television formats including, but not limited to, NTSC or PAL.
- a video digital-to-analog converter 34 converts the video stream into analog video for display on a display device (not shown) such as a television set, while an audio digital-to-analog-converter 36 converts the embedded audio to analog audio for subsequent reproduction by the display device or by other means (not shown).
- a central processing unit (CPU) 38 typically in the form of a microprocessor with associated memory, or a microcomputer or microcontroller, serves to control navigation, as well as other aspects of the DVD player, in accordance with viewer commands entered through a viewer interface (U/I) 40 , typically comprising the combination of an Infrared (I/R) transmitter, in the form of remote control, and an I/R receiver.
- the CPU 38 receives decoded metadata from the demultiplexer 24 and generates menu information for receipt by the synchronizer 32 . In this way, the menu information ultimately undergoes display for viewing by the viewer.
- the viewer typically will enter one or more commands through the U/I 40 for receipt by the CPU 38 , which in turn, controls the servomechanism 14 to displace the pick-up head 18 to retrieve the desired program content.
- the DVD specification (DVD Specifications for Read-Only Disc/Part 3. VIDEO SPECIFICATIONS, Version 1.0, August 1996), defines the smallest object to which DVD navigation can apply as a Video Object Unit (VOBU).
- the VOBU typically includes multiplexed video, audio, sub-picture, highlight and other navigation data, corresponding to a playback duration of about 0.4 to 1.2 seconds.
- Multiple sub-streams of audio and sub-picture data can exist in each VOBU (e.g., stereo and surround sound audio sub-streams and/or German and Portuguese subtitles).
- A/V stream This combination of such multiplexed data constitutes an “A/V stream.”
- multiple A/V streams are interleaved together into a single Video Object (VOB) stream in order to allow quick access from one stream to another for seamless or near-seamless switching.
- VOB Video Object
- the DVD specification defines an Interleave Unit (ILVU) as a block of one or more VOBUs in order to align the A/V stream content of multiple angles with a common time stamp, providing synchronization of the A/V streams.
- the synchronizer 32 decodes and displays only the ILVUs corresponding to the currently selected A/V stream.
- the DVD specification defines a maximum size of the ILVU based on number of angles (i.e., number of available streams), scan speed of the physical device, and size of the decode buffer (not shown). If this maximum size is exceeded, seamless playback of any angle cannot be guaranteed.
- A/V Audio/Visual
- Video Object Units VOBUs
- Transport Streams TSs
- sub-picture/subtitle typically have no innate frame rate, instead their frame rate is usually somehow derived or related to the video frame rate.
- VOBUs or TSs should include sub-picture/subtitle and audio packets whose presentation timestamps match the presentation timestamp of the video packets (within one unit of time reference of the sub-picture/subtitle or audio packet, respectively). If VOBUs or TSs are packed in this way, both synchronization and contextual correspondence between audio, sub-picture/subtitle, and video data is maintained where audio or sub-picture/subtitle data differs contextually between VOBUs or TSs for different A/V streams.
- a further issue is the potential corruption of audio or sub-picture/subtitle data when an ILVU for a new A/V stream is presented, as audio or sub-picture data packets at the beginning of the first VOBU in that ILVU (or at the angle change point marker of a TS) may be fragmented, and unable to be decoded until a subsequent, whole, packet occurs.
- the audio data packet at the start of the first VOBU in an ILVU should include an audio frame header
- the last audio packet in the last VOBU in an ILVU should include a complete audio frame, i.e., no audio frame fragmentation should occur across any ILVU boundary (or across any angle change point marker).
- sub-picture/subtitle data must start with a Sub-Picture Unit (SPU) header or an Epoch start header.
- SPU Sub-Picture Unit
- FIG. 2 a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation is indicated generally by the reference numeral 200 .
- the method 200 includes a start block 205 that passes control to a function block 210 .
- the function block 210 identifies sub-picture/subtitle packets and/or audio packets whose arrival timestamps match an arrival timestamp of the video packets, and passes control to a function block 220 .
- the function block 220 packs a Video Object Unit (VOBU) or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching arrival timestamps, and passes control to an end block 225 .
- the end block 225 terminates the method.
- VOBU Video Object Unit
- TS Transport Stream
- FIG. 3 a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation is indicated generally by the reference numeral 300 .
- the method 300 includes a start block 305 that passes control to a function block 310 .
- the function block 310 identifies sub-picture/subtitle packets and/or audio packets whose presentation timestamps match a presentation timestamp of the video packets, and passes control to a function block 320 .
- the function block 320 packs a Video Object Unit (VOBU) or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching presentation timestamps, and passes control to an end block 325 .
- VOBU Video Object Unit
- TS Transport Stream
- FIG. 4 a method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation is indicated generally by the reference numeral 400 .
- the method 400 includes a start block 405 that passes control to a function block 410 .
- the function block 410 packs an audio frame header into an audio packet at a beginning of a first Video Object Unit (VOBU) in an InterLeaVe Unit (ILVU), or packs an audio frame header into an audio packet at an angle change point marker in a Transport Stream (TS), and passes control to a function block 420 .
- VOBU Video Object Unit
- IMVU InterLeaVe Unit
- TS Transport Stream
- the function block 420 packs a last audio packet in a last VOBU in the ILVU (or in another ILVU in the same A/V stream), or packs a last audio packet immediately prior to another angle change point marker in the TS, so as to conclude with a complete audio frame (audio frame fragmentation is non-existent across any ILVU boundaries or angle change markers), and passes control to a function block 430 .
- the function block 430 packs sub-picture/subtitle packets to start with a Sub-Picture Unit (SPU) header or an Epoch start header, and passes control to an end block 435 .
- the end block 435 terminates the method.
- SPU Sub-Picture Unit
- each block of the program stream decoded by the decoder 22 of FIG. 1 includes a navigation packet (NV_PCK), a video packet (V_PCK), an audio packet (A_PCK) and a sub-picture packet (SP_PCK).
- the DVD specification defines a Seamless Angle Information data structure (SML_AGLI) in the navigation data structure (DSI) portion of the NV_PCK at the beginning of each VOBU that includes a table of ILVU start points indicating the location where the next ILVU for each seamless angle is located.
- SML_AGLI Seamless Angle Information data structure
- DSI navigation data structure
- Such information enables the CPU 38 of FIG. 1 to control the servomechanism 14 where to go within the VOB stream when it is ready to begin presenting the next ILVU.
- the DVD specification defines several data structures within a portion of the navigation data at the beginning of each VOBU that describe the Highlight Information (HLI) for interactive buttons.
- These data structures such as the Highlight General Information (HLI_GI), Button Color Information Table (BTN_COLIT), and Button Information Table (BTN_IT) define the number, position, appearance, and function of the buttons that appear in the screen display.
- HLI_GI Highlight General Information
- BTN_COLIT Button Color Information Table
- BTN_IT Button Information Table
- teachings of the present invention are implemented as a combination of hardware and software.
- the software is preferably implemented as an application program tangibly embodied on a program storage unit.
- the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
- the various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Computer Security & Cryptography (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
- Television Signal Processing For Recording (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
- Television Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
There are provided methods and apparatus for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation. A method includes the step of identifying sub-picture/subtitle packets and/or audio packets having arrival timestamps and/or presentation timestamps that match an arrival timestamp and/or a presentation timestamp, respectively, of video packets. The method also includes the step of packing a Video Object Unit (VOBU) and/or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching timestamps.
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 60/674,767, filed Apr. 26, 2006, which is incorporated by reference herein in its entirety.
- The present invention relates generally to Digital Versatile Discs, previously known as Digital Video Discs (DVDs), High Definition Digital Versatile Discs (HD DVD), and Blu-ray Disc (BD), and more particularly to a technique for facilitating synchronization among the sub-streams of different audio/visual (A/V) streams stored on a DVD, HD DVD, or BD.
- The DVD, HD DVD and Blu-ray specifications currently define mechanisms for seamlessly switching among multiple parallel A/V streams. However, in each case, the audio and sub-picture content of the streams is restricted to be bit-for-bit identical among all of the streams. This prevents any potential damage to audio speakers that could result from signal spikes caused by differences in the audio data from one A/V stream to another, and also reduces the restrictions regarding organization of such data within each multiplexed A/V stream. However, these restrictions also greatly limit the range of applications for which the seamless multi-angle feature may be used.
- The development of the DVD followed that of the Compact Disc (CD) in an effort to achieve sufficient storage capacity for large video files to enable a single disc to carry a full-length motion picture, albeit compressed using a compression technique such as that of the Moving Picture Experts Group (MPEG). Since its first introduction in the mid-1990s, the DVD has proliferated, becoming the preferred medium for wide-scale distribution of motion picture and video content to consumers. Similar optical disc formats for delivery of higher quality and greater amounts of audiovisual content have been developed as planned successors to DVD. Two of the most prominent formats are known as HD DVD and BD.
- Present-day DVDs, HD DVDs, and BDs typically include at least one, and usually several, A/V streams in parallel synchronism with each other. Often such A/V streams comprise different recordings of the same scene shot from different angles; hence, such different A/V streams are often referred to as "angles". Selection of a different angle (i.e., a different stream) occurs through a process known as "multi-angle navigation", whereby a viewer selects a desired angle by selecting an associated icon on a display screen. The DVD, HD DVD, and BD specifications adopted by the manufacturers of these discs and associated playback devices define a process known as "multi-angle video", whereby a content author can define as many as nine concurrent A/V streams, any one of which can appear on a display screen at any time. During playback, the viewer can switch seamlessly among a set of synchronized A/V streams by actuating a command via a button on a DVD, HD DVD, or BD player or on the player's remote control device; this form of multi-angle navigation is known as seamless multi-angle. However, under known format specifications and implementations of currently available DVD, HD DVD, and BD authoring tools, the audio and sub-picture data stored in each A/V stream must remain identical; that is, only the video data is allowed to differ between angles. (Sub-picture data describes the rendering of buttons, subtitles, and other graphical elements displayed over video.) This results in an inability to automatically present different audio and sub-picture content when a parallel A/V stream is selected, and also leads to redundant copies of audio and sub-picture data being stored on the delivery medium, limiting space for other content.
- At a basic level, A/V streams are constituted of data packets for the sub-streams (audio, video, and sub-picture), which are joined together into short units that, when read sequentially, comprise the presented stream. In DVD-Video, these fundamental data units are known as Video Object Units, or VOBUs, and each includes about 0.4 to 1 second of presentation data. In HD DVD-Video, these are known as EVOBUs; the terms VOBU and EVOBU may be used interchangeably herein for illustrative purposes. When multiple A/V streams are presented in parallel, each stream collects one or more VOBUs into an Interleave Unit, or ILVU, which is synchronized with the ILVUs of the other parallel A/V streams based on the video presentation time. Thus, when a new stream is selected, the data from the current ILVU plays until the end of that ILVU, and the ILVU for the new stream is presented seamlessly at that time. In this way, seamless presentation of video is assured.
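- This organization can be sketched roughly as follows. The Python model below is purely illustrative and not part of any disc specification: the class and function names, and the fixed two-VOBUs-per-ILVU grouping, are assumptions chosen for clarity. It groups each angle's VOBUs into ILVUs and lays the ILVUs of the parallel angles out so that blocks covering the same presentation interval are adjacent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vobu:
    angle: int          # which A/V stream ("angle") this VOBU belongs to
    start_time: float   # video presentation time at which this VOBU begins (s)
    duration: float     # about 0.4 to 1 second of presentation data

def group_into_ilvus(vobus, vobus_per_ilvu=2):
    """Collect consecutive VOBUs of one angle into ILVU blocks (lists)."""
    return [vobus[i:i + vobus_per_ilvu]
            for i in range(0, len(vobus), vobus_per_ilvu)]

def interleave(per_angle_ilvus):
    """Lay out ILVUs so that the ILVUs of all angles covering the same
    presentation interval are adjacent, allowing a quick jump from one
    angle to another at each ILVU boundary."""
    vob_stream = []
    for i in range(len(per_angle_ilvus[0])):
        for angle_ilvus in per_angle_ilvus:
            vob_stream.append(angle_ilvus[i])
    return vob_stream

# Two angles, four 0.5-second VOBUs each, two VOBUs per ILVU.
angles = [[Vobu(a, 0.5 * i, 0.5) for i in range(4)] for a in (0, 1)]
layout = interleave([group_into_ilvus(v) for v in angles])
# layout order: angle 0 ILVU 0, angle 1 ILVU 0, angle 0 ILVU 1, angle 1 ILVU 1
```

Switching streams then amounts to reading the neighboring angle's ILVU at the next boundary instead of continuing with the current angle's.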
- BD refers to a similar combination of packets using different terminology, namely the Transport Stream (TS). BD does not limit the duration of presentation data in the unit; instead of ILVUs, it uses angle change points in each TS to mark the points at which streams can be changed while ensuring video continuity.
- Audio, video, and sub-picture packets in VOBUs, TSs, RTP, or other packetized multimedia formats are all typically marked with a first type of timestamp indicating when they should be delivered for decoding and a second type of timestamp indicating when they should be presented. In the case of VOBUs, the delivery timestamps are encoded in the "system_clock_reference" as defined in ISO/IEC 13818-1. In the case of Transport Streams (TSs), delivery timestamps are typically called "arrival_timestamps", as defined in some of the specifications derived from ISO/IEC 13818-1. As used herein, the term "arrival timestamp" collectively refers to the delivery timestamp in VOBUs and TSs. The presentation timestamps are the usual PTSs as defined in ISO/IEC 13818-1.
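- The distinction between the two timestamp types can be made concrete with a small sketch. The field names below are illustrative stand-ins, not on-disc syntax; in MPEG-2 systems streams the delivery time comes from the system_clock_reference (or an arrival timestamp in transport-stream derivatives) and the presentation time from the PTS, which is counted in units of a 90 kHz clock.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Packet:
    kind: str             # "video", "audio", or "subpicture"
    arrival_ts: int       # first type: when to deliver the packet for decoding
    presentation_ts: int  # second type: when to present the decoded data

def delivery_order(packets):
    """Packets are read and fed to the decoders in arrival-timestamp order."""
    return sorted(packets, key=lambda p: p.arrival_ts)

def presentation_order(packets):
    """Decoded data is presented in PTS order, which may differ."""
    return sorted(packets, key=lambda p: p.presentation_ts)

# An audio packet can be delivered well before the video it accompanies:
pkts = [Packet("audio", arrival_ts=1, presentation_ts=8),
        Packet("video", arrival_ts=2, presentation_ts=4)]
```

Here the audio packet arrives first but is presented last, which is exactly the buffering behavior discussed next.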
- Due to different buffering models and decoder designs, non-video packets in a single VOBU (or at an angle change point marker in a TS) may not all refer to similar presentation times. For example, an audio packet may refer to presentation time 8, whereas a video packet may refer to presentation time 4, the audio packet for presentation time 4 having been delivered from a previous VOBU. When audio and sub-picture/subtitle data are identical between the VOBUs in the ILVUs (or between the TSs) of different A/V streams in a parallel presentation, switching ILVUs or TSs has no effect on audio, sub-picture/subtitle, and video synchronization or correspondence. However, when audio and sub-picture data packets differ between the VOBUs or TSs of different A/V streams, a case could occur where the audio or sub-picture/subtitle packets corresponding to the presentation time of the video from the new VOBU or TS have already been delivered from a previous VOBU or TS, resulting in audio or sub-picture/subtitle content that, while presented at the proper time, is out of correspondence with the current context.
- Thus, there exists a need for a method of storing data such that audio and sub-picture data can be contextually different across parallel, synchronized A/V streams playing from any one of these optical disc formats, while maintaining stream continuity as well as synchronization with video data as the viewer interactively selects different A/V streams during the presentation.
- These and other drawbacks and disadvantages of the prior art are addressed by the present invention, which is directed to synchronized stream packing.
- According to an aspect of the present invention, there is provided a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation. The method includes the step of identifying sub-picture/subtitle packets and/or audio packets having arrival timestamps and/or presentation timestamps that match an arrival timestamp and/or a presentation timestamp, respectively, of video packets. The method also includes the step of packing a Video Object Unit (VOBU) and/or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching timestamps.
- According to yet another aspect of the present invention, there is provided an apparatus for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation. The apparatus includes means for identifying sub-picture/subtitle packets and/or audio packets having arrival timestamps and/or presentation timestamps that match an arrival timestamp and/or a presentation timestamp, respectively, of video packets. The apparatus also includes means for packing a Video Object Unit (VOBU) and/or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching timestamps.
- According to a further aspect of the present invention, there is provided a method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation. The method includes the step of packing an audio frame header into an audio packet at a beginning of a first Video Object Unit (VOBU) in an InterLeaVe Unit (ILVU), or at an angle change point marker in a Transport Stream (TS). The method also includes the step of packing a last audio packet, in a last VOBU in the ILVU or another ILVU in a same one of the plurality of A/V streams, or immediately prior to another angle change point marker in the TS, so as to conclude with a complete audio frame.
- These and other aspects, features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments, which is to be read in connection with the accompanying drawings.
- The present invention may be better understood in accordance with the following exemplary figures, in which:
- FIG. 1 is a block diagram illustrating a DVD player to which the present invention may be applied, in accordance with an illustrative embodiment thereof;
- FIG. 2 is a flow diagram illustrating a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation in accordance with the present principles;
- FIG. 3 is a flow diagram illustrating a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation in accordance with the present principles;
- FIG. 4 is a flow diagram illustrating a method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation in accordance with the present principles; and
- FIG. 5 is a block diagram illustrating the relationship among an audio/visual stream, Video Object Units (VOBUs) and Interleave Units (ILVUs).
- The present invention is directed to synchronized stream packing. In accordance with an embodiment, a method is provided for constraining the organization of audio and sub-picture packets within multiplexed streams (e.g., MPEG program and transport streams) in order to allow seamless switching among multiple interleaved audio/video (A/V) presentations in which the audio content and/or sub-picture/subtitle content is different.
- The present description illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the invention and are included within its spirit and scope.
- All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
- Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
- Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
- The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), and non-volatile storage.
- Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The invention as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
- Turning to
FIG. 1, a Digital Versatile Disc (DVD) player 10 to which the present invention may be applied is indicated generally by the reference numeral 10. The DVD player 10 includes a drive motor 12 that rotates a DVD 13 under the control of a servomechanism 14. A pick-up head motor 16, also controlled by the servomechanism 14, serves to displace an optical pick-up head 18 across the DVD 13 to read information carried thereby. A pre-amplifier 20 amplifies the output signal of the pick-up head 18 for input to a decoder 22 that decodes the optical information read from the DVD 13 to yield a program stream. A de-multiplexer 24 de-multiplexes the program stream into separate components: (a) an audio stream; (b) a video stream; (c) a sub-picture stream; and (d) navigation information, typically in the form of metadata or the like. - The audio, video, and sub-picture streams undergo decoding by a separate one of the
audio decoder 26, video decoder 28 and sub-picture decoder 30, respectively. A synchronizer 32, sometimes known as a presentation engine, serves to synchronize and combine the separately decoded audio, video and sub-picture streams into a video stream, with embedded audio, for suitable reproduction in accordance with one of several known television formats including, but not limited to, NTSC or PAL. A video digital-to-analog converter 34 converts the video stream into analog video for display on a display device (not shown) such as a television set, while an audio digital-to-analog converter 36 converts the embedded audio to analog audio for subsequent reproduction by the display device or by other means (not shown). - Within the
DVD player 10, a central processing unit (CPU) 38, typically in the form of a microprocessor with associated memory, or a microcomputer or microcontroller, serves to control navigation, as well as other aspects of the DVD player, in accordance with viewer commands entered through a viewer interface (U/I) 40, typically comprising the combination of an Infrared (I/R) transmitter, in the form of a remote control, and an I/R receiver. Specifically with regard to navigation, the CPU 38 receives decoded metadata from the demultiplexer 24 and generates menu information for receipt by the synchronizer 32. In this way, the menu information ultimately undergoes display for viewing by the viewer. In response to the displayed information, the viewer typically will enter one or more commands through the U/I 40 for receipt by the CPU 38, which, in turn, controls the servomechanism 14 to displace the pick-up head 18 to retrieve the desired program content. - The DVD specification (DVD Specifications for Read-Only Disc/
Part 3. VIDEO SPECIFICATIONS, Version 1.0, August 1996), defines the smallest object to which DVD navigation can apply as a Video Object Unit (VOBU). The VOBU typically includes multiplexed video, audio, sub-picture, highlight and other navigation data, corresponding to a playback duration of about 0.4 to 1.2 seconds. Multiple sub-streams of audio and sub-picture data can exist in each VOBU (e.g., stereo and surround sound audio sub-streams and/or German and Portuguese subtitles). This combination of such multiplexed data constitutes an “A/V stream.” In a multi-angle segment, multiple A/V streams are interleaved together into a single Video Object (VOB) stream in order to allow quick access from one stream to another for seamless or near-seamless switching. - The DVD specification defines an Interleave Unit (ILVU) as a block of one or more VOBUs in order to align the A/V stream content of multiple angles with a common time stamp, providing synchronization of the A/V streams. During playback, the
synchronizer 32 decodes and displays only the ILVUs corresponding to the currently selected A/V stream. The DVD specification defines a maximum size of the ILVU based on number of angles (i.e., number of available streams), scan speed of the physical device, and size of the decode buffer (not shown). If this maximum size is exceeded, seamless playback of any angle cannot be guaranteed. - In accordance with an embodiment, there is provided a method for the storage of sub-picture/subtitle and/or audio data within at least one of a plurality of audio-visual streams presented in parallel in order to maintain synchronization between sub-picture/subtitle, audio, and video data as well as provide continuity between such data as different Audio/Visual (A/V) streams are selected during a presentation.
- To ensure constant synchronization and correspondence with the video for audio and sub-picture/subtitle packets which differ contextually between A/V streams in a parallel presentation, Video Object Units (VOBUs) or Transport Streams (TSs) should include sub-picture/subtitle and audio packets whose arrival timestamps match the arrival timestamp of the video packets (within one unit of time reference of the sub-picture/subtitle or audio packet, respectively). It is to be appreciated that sub-picture/subtitle data typically has no innate frame rate; instead, its frame rate is usually derived from, or otherwise related to, the video frame rate. The same rule applies to the presentation timestamps: VOBUs or TSs should include sub-picture/subtitle and audio packets whose presentation timestamps match the presentation timestamp of the video packets (within one unit of time reference of the sub-picture/subtitle or audio packet, respectively). If VOBUs or TSs are packed in this way, both synchronization and contextual correspondence between audio, sub-picture/subtitle, and video data are maintained where audio or sub-picture/subtitle data differs contextually between VOBUs or TSs for different A/V streams.
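- The packing rule just described can be sketched as a selection predicate. The code below is a hypothetical illustration: the packet model and the per-kind "one unit of time reference" values are assumptions for demonstration, not values taken from any specification. A VOBU or TS keeps only those audio and sub-picture/subtitle packets whose timestamps fall within one unit of time reference of a video packet's timestamp.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Packet:
    kind: str   # "video", "audio", or "subpicture"
    pts: int    # presentation timestamp; the same check would apply to
                # arrival timestamps

# Hypothetical "one unit of time reference" per packet kind (e.g., one audio
# frame duration for audio); real values depend on the coded formats used.
UNIT = {"audio": 10, "subpicture": 30}

def matches_video(packet, video_pts_list):
    """True when the packet's timestamp is within one unit of time reference
    of some video packet's timestamp in this VOBU/TS."""
    return any(abs(packet.pts - v) < UNIT[packet.kind] for v in video_pts_list)

def pack_unit(video_packets, other_packets):
    """Pack a VOBU/TS: the video packets, plus only those audio and
    sub-picture/subtitle packets that match the video timestamps."""
    video_pts = [v.pts for v in video_packets]
    return list(video_packets) + [p for p in other_packets
                                  if matches_video(p, video_pts)]

video = [Packet("video", 100), Packet("video", 130)]
others = [Packet("audio", 105),        # within 10 of 100: kept
          Packet("audio", 50),         # stale, from an earlier context: dropped
          Packet("subpicture", 120)]   # within 30 of 100: kept
packed = pack_unit(video, others)
```

Dropping the stale packet is what preserves contextual correspondence when a different angle's unit is read in its place.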
- A further issue is the potential corruption of audio or sub-picture/subtitle data when an ILVU for a new A/V stream is presented: audio or sub-picture data packets at the beginning of the first VOBU in that ILVU (or at the angle change point marker of a TS) may be fragmented, and cannot be decoded until a subsequent, whole packet occurs.
- To resolve this issue, the audio data packet at the start of the first VOBU in an ILVU (or at an angle change point marker of a TS) should include an audio frame header, and the last audio packet in the last VOBU in an ILVU (or the last audio packet immediately prior to an angle change point marker in a TS) should include a complete audio frame; i.e., no audio frame fragmentation should occur across any ILVU boundary (or across any angle change point marker). Similarly, sub-picture/subtitle data must start with a Sub-Picture Unit (SPU) header or an Epoch start header.
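- These boundary constraints can be expressed as a simple validation pass. The sketch below is illustrative rather than normative, and assumes a pre-parsed packet model: the boolean flags stand in for actually scanning the audio elementary stream for frame headers and frame boundaries.

```python
def audio_boundaries_ok(audio_packets):
    """audio_packets: the audio packets of one ILVU (or of the span between
    two angle change point markers in a TS), in delivery order. Each packet
    is a dict carrying two pre-computed flags."""
    if not audio_packets:
        return True
    first, last = audio_packets[0], audio_packets[-1]
    # The first packet must begin with an audio frame header, so a decoder
    # entering at this boundary can start decoding immediately; the last
    # packet must end on a frame boundary, so no frame is fragmented across
    # the ILVU boundary (or angle change point marker).
    return first["starts_with_frame_header"] and last["ends_on_frame_boundary"]

def subpicture_start_ok(sp_packets):
    """The sub-picture/subtitle data must start with an SPU header or an
    Epoch start header."""
    return (not sp_packets) or sp_packets[0]["starts_with_spu_or_epoch_header"]
```

An authoring tool would run such checks on every ILVU (or inter-marker span) of every angle before the disc image is finalized.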
- Turning to
FIG. 2, a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation is indicated generally by the reference numeral 200. - The
method 200 includes a start block 205 that passes control to a function block 210. The function block 210 identifies sub-picture/subtitle packets and/or audio packets whose arrival timestamps match an arrival timestamp of the video packets, and passes control to a function block 220. - The
function block 220 packs a Video Object Unit (VOBU) or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching arrival timestamps, and passes control to an end block 225. The end block 225 terminates the method. - Turning to
FIG. 3 , a method for synchronized stream packing of packets that differ contextually between A/V streams in a parallel presentation is indicated generally by thereference numeral 300. - The
method 300 includes a start block 305 that passes control to a function block 310. The function block 310 identifies sub-picture/subtitle packets and/or audio packets whose presentation timestamps match a presentation timestamp of the video packets, and passes control to a function block 320. The function block 320 packs a Video Object Unit (VOBU) or a Transport Stream (TS) with the identified sub-picture/subtitle and audio packets and the video packets having the matching presentation timestamps, and passes control to an end block 325. The end block 325 terminates the method. - Turning to
FIG. 4, a method for presenting a different A/V stream from among a plurality of A/V streams that differ contextually in a parallel presentation is indicated generally by the reference numeral 400. - The
method 400 includes a start block 405 that passes control to a function block 410. The function block 410 packs an audio frame header into an audio packet at a beginning of a first Video Object Unit (VOBU) in an InterLeaVe Unit (ILVU), or packs an audio frame header into an audio packet at an angle change point marker in a Transport Stream (TS), and passes control to a function block 420. - The
function block 420 packs a last audio packet in a last VOBU in the ILVU (or in another ILVU in the same A/V stream), or packs a last audio packet immediately prior to another angle change point marker in the TS, so as to conclude with a complete audio frame (audio frame fragmentation does not occur across any ILVU boundaries or angle change markers), and passes control to a function block 430. - The
function block 430 packs sub-picture/subtitle packets to start with a Sub-Picture Unit (SPU) header or an Epoch start header, and passes control to an end block 435. The end block 435 terminates the method. - Turning to
FIG. 5, the relationship of multiplexed A/V stream data to VOBU and ILVU data structures for multi-angle video is indicated generally by the reference numeral 500. As illustrated in FIG. 5, each block of the program stream decoded by the decoder 22 of FIG. 1 includes a navigation packet (NV_PCK), a video packet (V_PCK), an audio packet (A_PCK), and a sub-picture packet (SP_PCK). The DVD specification defines a Seamless Angle Information data structure (SML_AGLI) in the navigation data structure (DSI) portion of the NV_PCK at the beginning of each VOBU, which includes a table of ILVU start points indicating where the next ILVU for each seamless angle is located. Such information enables the CPU 38 of FIG. 1 to direct the servomechanism 14 where to go within the VOB stream when it is ready to begin presenting the next ILVU. - In addition, the DVD specification defines several data structures within a portion of the navigation data at the beginning of each VOBU that describe the Highlight Information (HLI) for interactive buttons. These data structures, such as the Highlight General Information (HLI_GI), Button Color Information Table (BTN_COLIT), and Button Information Table (BTN_IT), define the number, position, appearance, and function of the buttons that appear in the screen display.
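The SML_AGLI lookup described above can be modeled very simply: each VOBU's navigation packet carries a table mapping each seamless angle to the start address of that angle's next ILVU, and the player consults it when switching angles. The dictionary form and sector numbers below are assumptions for illustration, not the DVD specification's binary layout.

```python
def next_ilvu_address(sml_agli, angle):
    """Return the start address of the next ILVU for the chosen angle,
    as read from the SML_AGLI table in the current VOBU's NV_PCK."""
    return sml_agli[angle]

# One entry per seamless angle (illustrative sector numbers).
sml_agli = {1: 0x1000, 2: 0x2400, 3: 0x3800}
print(hex(next_ilvu_address(sml_agli, 2)))  # 0x2400
```

Because the table travels with every VOBU, the player can re-target the servomechanism at each VOBU boundary without scanning the stream.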
- These and other features and advantages of the present invention may be readily ascertained by one of ordinary skill in the pertinent art based on the teachings herein. It is to be understood that the teachings of the present invention may be implemented in various forms of hardware, software, firmware, special purpose processors, or combinations thereof.
- Most preferably, the teachings of the present invention are implemented as a combination of hardware and software. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
- It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying drawings are preferably implemented in software, the actual connections between the system components or the process function blocks may differ depending upon the manner in which the present invention is programmed. Given the teachings herein, one of ordinary skill in the pertinent art will be able to contemplate these and similar implementations or configurations of the present invention.
- Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the present invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Claims (2)
1-10. (canceled)
11. A method, comprising:
packing an audio frame header into an audio packet at an angle change point in a Transport Stream (TS);
packing a last audio packet immediately prior to another angle change point in the TS, so as to conclude with a complete audio frame; and
packing a subtitle packet to start with an Epoch start header.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/155,969 US20140126886A1 (en) | 2005-04-26 | 2014-01-15 | Synchronized stream packing |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US67476705P | 2005-04-26 | 2005-04-26 | |
US91851107A | 2007-10-15 | 2007-10-15 | |
US14/155,969 US20140126886A1 (en) | 2005-04-26 | 2014-01-15 | Synchronized stream packing |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US91851107A Continuation | 2005-04-26 | 2007-10-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140126886A1 true US20140126886A1 (en) | 2014-05-08 |
Family
ID=36685590
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/918,511 Active 2031-07-16 US9167220B2 (en) | 2005-04-26 | 2006-03-16 | Synchronized stream packing |
US14/155,956 Abandoned US20140126885A1 (en) | 2005-04-26 | 2014-01-15 | Synchronized stream packing |
US14/155,969 Abandoned US20140126886A1 (en) | 2005-04-26 | 2014-01-15 | Synchronized stream packing |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/918,511 Active 2031-07-16 US9167220B2 (en) | 2005-04-26 | 2006-03-16 | Synchronized stream packing |
US14/155,956 Abandoned US20140126885A1 (en) | 2005-04-26 | 2014-01-15 | Synchronized stream packing |
Country Status (7)
Country | Link |
---|---|
US (3) | US9167220B2 (en) |
EP (2) | EP2860732A1 (en) |
JP (8) | JP5116664B2 (en) |
KR (2) | KR20080016999A (en) |
CN (2) | CN101902628B (en) |
MY (2) | MY155199A (en) |
WO (1) | WO2006115606A2 (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2860732A1 (en) | 2005-04-26 | 2015-04-15 | Thomson Licensing | Synchronized stream packing |
KR100905723B1 (en) * | 2006-12-08 | 2009-07-01 | 한국전자통신연구원 | System and Method for Digital Real Sense Transmitting/Receiving based on Non-Realtime |
US9794605B2 (en) * | 2007-06-28 | 2017-10-17 | Apple Inc. | Using time-stamped event entries to facilitate synchronizing data streams |
JP4882989B2 (en) * | 2007-12-10 | 2012-02-22 | ソニー株式会社 | Electronic device, reproduction method and program |
US9060187B2 (en) * | 2008-12-22 | 2015-06-16 | Netflix, Inc. | Bit rate stream switching |
JP2010239288A (en) * | 2009-03-30 | 2010-10-21 | Sony Corp | Information processing device and method |
EP2863393B1 (en) * | 2010-08-04 | 2018-10-17 | Nero Ag | Multi-language buffering during media playback |
KR20120035406A (en) * | 2010-10-05 | 2012-04-16 | 삼성전자주식회사 | Method and apparatus for playing moving picuture files |
KR101885852B1 (en) | 2011-09-29 | 2018-08-08 | 삼성전자주식회사 | Method and apparatus for transmitting and receiving content |
DE112013003222B4 (en) | 2012-06-29 | 2021-08-19 | Denso Corporation | Semiconductor device and semiconductor device connection structure |
CN103179435B (en) * | 2013-02-27 | 2016-09-28 | 北京视博数字电视科技有限公司 | A kind of multi-path video data multiplexing method and device |
CN106162323A (en) * | 2015-03-26 | 2016-11-23 | 无锡天脉聚源传媒科技有限公司 | A kind of video data handling procedure and device |
CN109040779B (en) * | 2018-07-16 | 2019-11-26 | 腾讯科技(深圳)有限公司 | Caption content generation method, device, computer equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5784528A (en) * | 1995-09-29 | 1998-07-21 | Matsushita Electric Industrial Co. Ltd. | Method and an apparatus for interleaving bitstream to record thereof on a recording medium, and reproducing the interleaved bitstream therefrom |
US6285825B1 (en) * | 1997-12-15 | 2001-09-04 | Matsushita Electric Industrial Co., Ltd. | Optical disc, recording apparatus, a computer-readable storage medium storing a recording program, and a recording method |
US6363208B2 (en) * | 1997-03-19 | 2002-03-26 | Sony Corporation | Digital signal reproduction method and apparatus |
WO2005004478A1 (en) * | 2003-07-03 | 2005-01-13 | Matsushita Electric Industrial Co., Ltd. | Recording medium, reproduction apparatus, recording method, integrated circuit, program, and reproduction method |
US7315690B2 (en) * | 1995-04-11 | 2008-01-01 | Kabushiki Kaisha Toshiba | Recording medium, recording apparatus and recording method for recording data into recording medium, and reproducing apparatus and reproducing method for reproducing data from recording medium |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5915066A (en) | 1995-02-16 | 1999-06-22 | Kabushiki Kaisha Toshiba | Output control system for switchable audio channels |
TW303570B (en) * | 1995-09-29 | 1997-04-21 | Matsushita Electric Ind Co Ltd | |
TW385431B (en) | 1995-09-29 | 2000-03-21 | Matsushita Electric Ind Co Ltd | A method and an apparatus for encoding a bitstream with plural possible searching reproduction paths information useful in multimedia optical disk |
TW303569B (en) | 1995-09-29 | 1997-04-21 | Matsushita Electric Ind Co Ltd | |
JP3375619B1 (en) * | 1995-09-29 | 2003-02-10 | 松下電器産業株式会社 | Bit stream generation method and information recording method |
TW305043B (en) | 1995-09-29 | 1997-05-11 | Matsushita Electric Ind Co Ltd | |
TW436777B (en) * | 1995-09-29 | 2001-05-28 | Matsushita Electric Ind Co Ltd | A method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween |
US6484266B2 (en) * | 1995-09-29 | 2002-11-19 | Matsushita Electric Industrial Co., Ltd. | Method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween |
TW335480B (en) * | 1995-09-29 | 1998-07-01 | Matsushita Electric Ind Co Ltd | Method and apparatus for encoding a bistream for multi-angle connection |
US5838678A (en) * | 1996-07-24 | 1998-11-17 | Davis; Joseph W. | Method and device for preprocessing streams of encoded data to facilitate decoding streams back-to back |
WO1998013769A1 (en) * | 1996-09-27 | 1998-04-02 | Matsushita Electric Industrial Co., Ltd. | Method of generating multimedia stream which enables selective reproduction of video data and multimedia optical disk authoring system |
CN1192028A (en) | 1996-11-13 | 1998-09-02 | 松下电器产业株式会社 | System flow replaying control information editor and its method and record media |
JP3791114B2 (en) | 1997-04-30 | 2006-06-28 | ソニー株式会社 | Signal reproducing apparatus and method |
JPH10312648A (en) * | 1997-05-13 | 1998-11-24 | Sanyo Electric Co Ltd | Data packet train generating method, data transmitting method using the same, data accumulating method and mpeg picture transmitting system |
US6580870B1 (en) * | 1997-11-28 | 2003-06-17 | Kabushiki Kaisha Toshiba | Systems and methods for reproducing audiovisual information with external information |
US7031348B1 (en) | 1998-04-04 | 2006-04-18 | Optibase, Ltd. | Apparatus and method of splicing digital video streams |
TW439054B (en) * | 1998-04-08 | 2001-06-07 | Matsushita Electric Ind Co Ltd | Optical disc, optical disc recording method and apparatus, and optical disc reproducing method and apparatus |
WO2000030113A1 (en) | 1998-11-16 | 2000-05-25 | Koninklijke Philips Electronics N.V. | Method and device for recording real-time information |
EP1057184B1 (en) | 1998-11-16 | 2016-04-27 | Koninklijke Philips N.V. | Method and device for recording real-time information |
JP2000152179A (en) | 1998-11-17 | 2000-05-30 | Pioneer Electronic Corp | Video data reproducing method, video data reproducing device, video data recording method and video data recorder |
US6865747B1 (en) * | 1999-04-01 | 2005-03-08 | Digital Video Express, L.P. | High definition media storage structure and playback mechanism |
JP2000298918A (en) | 1999-04-14 | 2000-10-24 | Alpine Electronics Inc | Disk reproducing device |
JP2000339933A (en) | 1999-05-31 | 2000-12-08 | Kenwood Corp | Recording medium reproducing system |
AU1452801A (en) | 1999-11-10 | 2001-06-06 | Thomson Licensing S.A. | Fading feature for a dvd recorder |
US6567086B1 (en) * | 2000-07-25 | 2003-05-20 | Enroute, Inc. | Immersive video system using multiple video streams |
JP3578069B2 (en) * | 2000-09-13 | 2004-10-20 | 日本電気株式会社 | Long-term image / sound compression apparatus and method |
US20020131761A1 (en) * | 2001-01-16 | 2002-09-19 | Kojiro Kawasaki | Information recording medium, apparatus and method for recording/reproducing information to/from the medium |
TW579506B (en) * | 2001-03-05 | 2004-03-11 | Matsushita Electric Ind Co Ltd | A recording device and a recording method |
US7274862B2 (en) * | 2001-09-27 | 2007-09-25 | Sony Corporation | Information processing apparatus |
JP4197230B2 (en) | 2002-02-13 | 2008-12-17 | パイオニア株式会社 | FORMAT CONVERSION DEVICE, FORMAT CONVERSION METHOD, FORMAT CONVERSION PROCESSING PROGRAM, RECORDING MEDIUM CONTAINING FORMAT CONVERSION PROCESSING PROGRAM, AND INFORMATION RECORDING DEVICE, INFORMATION RECORDING METHOD, INFORMATION RECORDING PROCESSING PROGRAM, AND RECORDING MEDIUM CONTAINING INFORMATION RECORDING PROCESSING PROGRAM |
JP2003249057A (en) | 2002-02-26 | 2003-09-05 | Toshiba Corp | Enhanced navigation system using digital information medium |
RU2004130855A (en) * | 2002-03-20 | 2005-04-10 | Конинклейке Филипс Электроникс Н.В. (Nl) | METHOD AND DEVICE FOR RECORDING REAL-TIME INFORMATION |
JP2004079055A (en) | 2002-08-14 | 2004-03-11 | Toshiba Corp | Optical disk device, optical disk processing method, and optical disk |
JP3954473B2 (en) * | 2002-10-01 | 2007-08-08 | パイオニア株式会社 | Information recording medium, information recording apparatus and method, information reproducing apparatus and method, information recording / reproducing apparatus and method, computer program for recording or reproduction control, and data structure including control signal |
AU2003277571A1 (en) | 2002-11-11 | 2004-06-03 | Sony Corporation | Information processing device and method, program storage medium, recording medium, and program |
JP2004253842A (en) * | 2003-02-18 | 2004-09-09 | Matsushita Electric Ind Co Ltd | Video data converting method and video data reproducing method |
US7693394B2 (en) * | 2003-02-26 | 2010-04-06 | Lg Electronics Inc. | Recording medium having data structure for managing reproduction of data streams recorded thereon and recording and reproducing methods and apparatuses |
WO2004082272A1 (en) | 2003-03-10 | 2004-09-23 | Pioneer Corporation | Information recording medium, information recording device and method, information reproduction device and method, information recording/reproduction device and method, recording or reproduction control computer program, and data structure containing control signal |
WO2004090364A1 (en) | 2003-04-08 | 2004-10-21 | Honda Motor Co. Ltd. | Constant velocity joint and method of manufacturing the same |
EP1634290A1 (en) | 2003-06-05 | 2006-03-15 | Zootech Limited | Scrambled video streams in an audiovisual product |
GB0312874D0 (en) * | 2003-06-05 | 2003-07-09 | Zoo Digital Group Plc | Controlling access to an audiovisual product |
JP3944122B2 (en) * | 2003-06-05 | 2007-07-11 | 株式会社東芝 | Information recording medium, information recording method, information recording apparatus, information reproducing method, and information reproducing apparatus |
US6858475B2 (en) | 2003-06-30 | 2005-02-22 | Intel Corporation | Method of forming an integrated circuit substrate |
CN100477774C (en) * | 2003-06-30 | 2009-04-08 | 松下电器产业株式会社 | Data processing device and data processing method |
KR20050018315A (en) * | 2003-08-05 | 2005-02-23 | 삼성전자주식회사 | Information storage medium of storing information for downloading text subtitle, method and apparatus for reproducing subtitle |
KR20050018314A (en) * | 2003-08-05 | 2005-02-23 | 삼성전자주식회사 | Information storage medium of storing subtitle data and video mapping data information, reproducing apparatus and method thereof |
JP4057980B2 (en) * | 2003-08-26 | 2008-03-05 | 株式会社東芝 | Optical disc apparatus, optical disc reproducing method, and optical disc |
KR100619008B1 (en) * | 2003-09-23 | 2006-08-31 | 삼성전자주식회사 | Information storage medium storing multi-angle data, and reproducing method and apparatus thereof |
JP4178400B2 (en) | 2003-09-29 | 2008-11-12 | 日本電気株式会社 | Program storing / reproducing system, program storing / reproducing method, program |
JP4594766B2 (en) * | 2004-03-10 | 2010-12-08 | パナソニック株式会社 | Authoring system, program, authoring method. |
JP2004335102A (en) | 2004-07-22 | 2004-11-25 | Matsushita Electric Ind Co Ltd | Optical disk player |
EP2860732A1 (en) * | 2005-04-26 | 2015-04-15 | Thomson Licensing | Synchronized stream packing |
- 2006
- 2006-03-16 EP EP20140186408 patent/EP2860732A1/en not_active Ceased
- 2006-03-16 US US11/918,511 patent/US9167220B2/en active Active
- 2006-03-16 EP EP06738626A patent/EP1875740A2/en not_active Ceased
- 2006-03-16 WO PCT/US2006/009588 patent/WO2006115606A2/en active Application Filing
- 2006-03-16 KR KR1020077024330A patent/KR20080016999A/en active Search and Examination
- 2006-03-16 KR KR1020137025249A patent/KR101491684B1/en active IP Right Grant
- 2006-03-16 JP JP2008508844A patent/JP5116664B2/en active Active
- 2006-03-16 CN CN201010218387XA patent/CN101902628B/en active Active
- 2006-03-16 CN CN2006800138015A patent/CN101164347B/en active Active
- 2006-04-24 MY MYPI20061863A patent/MY155199A/en unknown
- 2006-04-24 MY MYPI20093313A patent/MY155202A/en unknown
- 2011
- 2011-12-27 JP JP2011284588A patent/JP5687614B2/en active Active
- 2012
- 2012-11-26 JP JP2012257241A patent/JP5419241B2/en active Active
- 2012-11-26 JP JP2012257242A patent/JP5419242B2/en active Active
- 2013
- 2013-04-23 JP JP2013089933A patent/JP5649195B2/en active Active
- 2013-04-23 JP JP2013089934A patent/JP5700589B2/en active Active
- 2014
- 2014-01-15 US US14/155,956 patent/US20140126885A1/en not_active Abandoned
- 2014-01-15 US US14/155,969 patent/US20140126886A1/en not_active Abandoned
- 2014-03-11 JP JP2014047209A patent/JP5939585B2/en active Active
- 2015
- 2015-05-12 JP JP2015097045A patent/JP6226432B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7315690B2 (en) * | 1995-04-11 | 2008-01-01 | Kabushiki Kaisha Toshiba | Recording medium, recording apparatus and recording method for recording data into recording medium, and reproducing apparatus and reproducing method for reproducing data from recording medium |
US5784528A (en) * | 1995-09-29 | 1998-07-21 | Matsushita Electric Industrial Co. Ltd. | Method and an apparatus for interleaving bitstream to record thereof on a recording medium, and reproducing the interleaved bitstream therefrom |
US6363208B2 (en) * | 1997-03-19 | 2002-03-26 | Sony Corporation | Digital signal reproduction method and apparatus |
US6285825B1 (en) * | 1997-12-15 | 2001-09-04 | Matsushita Electric Industrial Co., Ltd. | Optical disc, recording apparatus, a computer-readable storage medium storing a recording program, and a recording method |
WO2005004478A1 (en) * | 2003-07-03 | 2005-01-13 | Matsushita Electric Industrial Co., Ltd. | Recording medium, reproduction apparatus, recording method, integrated circuit, program, and reproduction method |
Also Published As
Publication number | Publication date |
---|---|
JP2013102437A (en) | 2013-05-23 |
CN101164347B (en) | 2010-08-25 |
MY155199A (en) | 2015-09-15 |
JP5419241B2 (en) | 2014-02-19 |
JP5649195B2 (en) | 2015-01-07 |
JP6226432B2 (en) | 2017-11-08 |
KR101491684B1 (en) | 2015-02-11 |
JP5700589B2 (en) | 2015-04-15 |
MY155202A (en) | 2015-09-15 |
CN101902628A (en) | 2010-12-01 |
US20140126885A1 (en) | 2014-05-08 |
WO2006115606A2 (en) | 2006-11-02 |
JP2013085260A (en) | 2013-05-09 |
JP5939585B2 (en) | 2016-06-22 |
EP1875740A2 (en) | 2008-01-09 |
JP2013176135A (en) | 2013-09-05 |
EP2860732A1 (en) | 2015-04-15 |
WO2006115606A3 (en) | 2007-01-25 |
JP2008539656A (en) | 2008-11-13 |
JP5419242B2 (en) | 2014-02-19 |
JP5116664B2 (en) | 2013-01-09 |
CN101164347A (en) | 2008-04-16 |
KR20080016999A (en) | 2008-02-25 |
JP2014139858A (en) | 2014-07-31 |
US20090067813A1 (en) | 2009-03-12 |
US9167220B2 (en) | 2015-10-20 |
JP2015167064A (en) | 2015-09-24 |
JP2012147427A (en) | 2012-08-02 |
CN101902628B (en) | 2012-05-02 |
JP5687614B2 (en) | 2015-03-18 |
JP2013176134A (en) | 2013-09-05 |
KR20130113540A (en) | 2013-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9167220B2 (en) | Synchronized stream packing | |
JP6840278B2 (en) | Reproduction device and reproduction method | |
JP2009224024A (en) | Program recording device and program recording method | |
JP6811833B2 (en) | Playback method and playback device | |
CN111276170B (en) | Decoding system and decoding method | |
JP2008193203A (en) | Digital video information data generation apparatus, digital video information recording apparatus, digital video information reproducing apparatus, and digital video information data generation method | |
CN110675895B (en) | Reproducing method, reproducing apparatus, and recording medium | |
JP6991279B2 (en) | Decoder system and decoding method | |
JP2008141693A (en) | Content reproducing apparatus and content reproduction method | |
JP4095221B2 (en) | Apparatus and method for reproducing multi-scene recording medium | |
WO2003065715A1 (en) | Audio/video data recording/reproduction apparatus, system, and method, recording medium recorded by them, audio/video data reproduction apparatus, and data structure | |
WO2016079925A1 (en) | Recording medium, playback method, and playback device | |
JP2005276438A (en) | Information recording medium and replica disk |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: THOMSON LICENSING, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TECHNICOLOR, INC;REEL/FRAME:034706/0530 Effective date: 20140113 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |