US20110066929A1 - System and method for providing information of selectable objects in a still image file and/or data stream
- Publication number
- US20110066929A1 (application US 12/881,031)
- Authority
- US
- United States
- Prior art keywords
- user
- still image
- information
- selectable object
- data set
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/0304—Detection arrangements using opto-electronic means
- H04N21/482—End-user interface for program selection
- G06F3/0308—Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0386—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry, for light pen
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0428—Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
- H04N21/234318—Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
- H04N21/23892—Multiplex stream processing involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
- H04N21/2408—Monitoring of the upstream path of the transmission network, e.g. client requests
- H04N21/25841—Management of client data involving the geographical location of the client
- H04N21/2668—Creating a channel for a dedicated end-user group, e.g. insertion of targeted commercials based on end-user profiles
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42209—Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk, by playing back content from the storage medium
- H04N21/4334—Recording operations
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving MPEG packets from an IP network
- H04N21/44008—Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
- H04N21/4524—Management of client data or end-user data involving the geographical location of the client
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4722—End-user interface for requesting additional data associated with the content
- H04N21/4725—End-user interface for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
- H04N21/4728—End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
- H04N21/47805—Electronic banking
- H04N21/47815—Electronic shopping
- H04N21/4782—Web browsing, e.g. WebTV
- H04N21/4826—End-user interface for program selection using recommendation lists, e.g. of programs or channels sorted out according to their score
- H04N21/4828—End-user interface for program selection for searching program descriptors
- H04N21/812—Monomedia components involving advertisement data
- H04N21/8126—Monomedia components involving additional data, e.g. news, sports, stocks, weather forecasts
- H04N21/8133—Monomedia components involving additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
- H04N21/8173—End-user applications, e.g. Web browser, game
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8545—Content authoring for generating interactive applications
- H04N21/858—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/76—Television signal recording
- H04N9/8205—Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device, characterized by hardware details
- H04N21/42222—Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
Definitions
- FIG. 1 is a diagram illustrating an exemplary media system, in accordance with various aspects of the present invention.
- FIG. 2 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention.
- FIG. 3 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention.
- FIG. 4 is a flow diagram illustrating an exemplary method for providing information of selectable objects in a still image in an information stream independent of the still image, in accordance with various aspects of the present invention.
- FIG. 5 is a diagram illustrating an exemplary media system, in accordance with various aspects of the present invention.
- FIG. 6 is a diagram illustrating exemplary modules and/or sub-modules for a media system, in accordance with various aspects of the present invention.
- modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware).
- modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such.
- various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory).
- aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- any or all of the functional modules discussed herein may share various hardware and/or software components.
- any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions.
- various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
- a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, personal computer device, media presentation system, image presentation system, camera, media server, image server, television, television control device, television provider, television programming provider, television receiver, video and/or image recording device, etc.) may communicate with other systems.
- a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), a general data communication network (e.g., the Internet), any home or premises communication network, etc.
- a particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- Such a pointing location refers to a location on a video screen (e.g., a computer display, a display of a portable electronic device, a display of a digital photograph display, a television display, a primary television screen, a secondary television screen, etc.) to which a user (either directly or with a pointing device) is pointing.
- a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing.
- Various aspects of the present invention, while referring to on-screen pointing location, are also readily extensible to such other forms of on-screen location identification.
- Such television programming may, for example, communicate still images.
- Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded television programming, broadcast/multicast/unicast television programming, etc.).
- Such television programming may, for example, comprise real-time television broadcast programming (or multicast or unicast television programming) and/or user-stored television programming that is stored in a user device (e.g., a VCR, PVR, etc.).
- Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a graphical video game, etc.).
- Such still images may, for example, comprise pictures.
- such still images may correspond to still photographs (e.g., taken with a digital camera), scanned images created by a scanner, facsimile images, etc.
- Such still images may, for example, be represented in a data file (e.g., a JPEG file, a bitmap, a TIFF file, etc.), or other data structure, and may be communicated in one or more streams of data.
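The notion above of a still image data set being "communicated in one or more streams of data" can be sketched minimally as follows. The chunk size and framing are illustrative assumptions only; no particular transport or image standard is implied.

```python
def chunk_image_data(image_bytes: bytes, chunk_size: int = 4096):
    """Yield successive chunks of a still image data set for communication
    in a stream of data.  The chunk size is an arbitrary example."""
    for offset in range(0, len(image_bytes), chunk_size):
        yield image_bytes[offset:offset + chunk_size]


def reassemble(chunks) -> bytes:
    """Reassemble the streamed chunks into the original data set."""
    return b"".join(chunks)
```

The same data set (e.g., a JPEG or TIFF file) may thus exist as a bounded data structure on storage media or as a sequence of chunks on a communication network.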
- Such user-selectable objects include both animate (i.e., living) and inanimate (i.e., non-living) objects.
- Such objects may, for example, comprise characteristics of any of a variety of objects present in still images.
- Such objects may, for example and without limitation, comprise inanimate objects, such as consumer good objects (e.g., clothing, automobiles, shoes, jewelry, furniture, food, beverages, appliances, electronics, toys, artwork, cosmetics, recreational vehicles, sports equipment, safety equipment, computer equipment, communication devices, books, etc.), premises objects (e.g., business locations, stores, hotels, signs, doors, buildings, landmarks, historical sites, entertainment venues, hospitals, government buildings, etc.), objects related to services (e.g., objects related to transportation, objects related to emergency services, objects related to general government services, objects related to entertainment services, objects related to food and/or drink services, etc.), objects related to location (e.g., parks, landmarks, streets, signs, road signs, etc.), etc.
- Such objects may, for example, comprise animate objects, such as people (e.g., actors/actresses, athletes, musicians, salespeople, commentators, reporters, analysts, hosts/hostesses, entertainers, etc.), animals (e.g., pets, zoo animals, wild animals, etc.) and plants (e.g., flowers, trees, shrubs, fruits, vegetables, cacti, etc.).
- the exemplary system 100 includes a media information provider 110 .
- the media information provider 110 may, for example, comprise a server associated with a media company, a cable company, an image-providing company, a movie-providing company, a news company, an educational institution, etc.
- the media information provider 110 may, for example, be an original source of still images (or related information).
- the media information provider 110 may be a communication company that provides still image distribution services (e.g., a cable media company, a satellite media company, a telecommunication company, a data network provider, etc.).
- the media information provider 110 may, for example, provide both media (e.g., still image) information and non-media information.
- the media information provider 110 may, for example, provide both moving picture information and still picture information.
- the media information provider 110 may, for example, provide information related to a still image (e.g., information describing or otherwise related to user-selectable objects in still images, etc.). As will be discussed below in more detail, the media information provider 110 may operate to create and/or communicate a still image (e.g., a still image data set, still image data stream, etc.) that includes embedded information of user-selectable objects in the still image.
- such a media information provider 110 may operate to receive a completed initial still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), for example via a communication network and/or on physical media, and embed information of user-selectable objects in the completed initial still image data set.
- a media information provider 110 may operate to form an original still image data set (e.g., a data file or other bounded data structure, etc.) and embed information of user-selectable objects in the original still image data set during such formation (e.g., in the studio, on an enterprise computing system, on a personal computer, etc.).
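Embedding user-selectable object information in a completed still image data set can be sketched as below. The choice of a JPEG APP11 (0xFFEB) application segment carrying a JSON payload is purely an illustrative assumption, not an existing convention; it is used here only because standards-compliant decoders skip unknown application segments, so the image still displays normally on systems unaware of the embedded data.

```python
import json
import struct


def embed_object_info(jpeg_bytes: bytes, objects: list) -> bytes:
    """Embed user-selectable object information in a completed JPEG data set.

    Sketch only: the APP11 segment and JSON payload are hypothetical.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG data set")
    payload = json.dumps(objects).encode("utf-8")
    # The two-byte big-endian length field counts itself plus the payload.
    segment = b"\xff\xeb" + struct.pack(">H", len(payload) + 2) + payload
    # Insert the new segment immediately after the start-of-image marker.
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```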
- the media information provider 110 may be remote from a user's local image presentation system (e.g., located at a premises different from the user's premises) or may be local to the user's local image presentation system (e.g., a personal media player, a digital photo presentation system, a DVR, a personal computing device, a personal electronic device, a personal cellular telephone, a personal digital assistant, a camera, a moving picture camera, an image recorder, a data server, an image and/or television receiver, a television, etc.).
- the media information provider 110 may alternatively, for example, operate to form and/or communicate a user-selectable object data set that includes information of user-selectable objects in a still image.
- a user-selectable object data set for a still image may, for example, be independent of a data set that generally represents the still image (e.g., generally represents the still image without information of user-selectable objects in such still image).
- such a media information provider 110 may operate to receive a completed still image data set (e.g., a data file or other finite group of data, a data stream, etc.), for example via a communication network and/or on a physical medium, and form the user-selectable object data set independent of the completed still image data set. Also for example, such a media information provider 110 may operate to form both an original still image data set and form the corresponding user-selectable object data set.
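A user-selectable object data set that is independent of the still image data set itself can be sketched as a "sidecar" structure linked to the image by an identifier. The schema below (image_id / objects / label / region / info) is an illustrative assumption; the text does not prescribe concrete field names.

```python
def form_object_data_set(image_id: str, objects: list) -> dict:
    """Form a user-selectable object data set independent of the data set
    that generally represents the still image itself."""
    return {
        "image_id": image_id,  # links the object data set to its still image
        "objects": [
            {
                "label": obj["label"],
                "region": obj["region"],       # e.g., (x0, y0, x1, y1)
                "info": obj.get("info", ""),   # descriptive or related info
            }
            for obj in objects
        ],
    }
```

Such a data set could then be communicated in its own data file or data stream, separately from the image data.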
- the exemplary media system 100 may also include a third party image information provider 120 .
- a third party image information provider (e.g., a party that may be independent of a still image source, media network operator, etc.) may, for example, provide information related to a still image. Such information may, for example, comprise information describing user-selectable objects in still images, media guide information, etc.
- such a third party image information provider 120 may operate to receive an initial completed still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), for example via a communication network and/or on physical media, and embed information of user-selectable objects in the initial completed still image data set.
- the exemplary media system 100 may include one or more communication networks (e.g., the communication network(s) 130 ).
- the exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which still image and/or information related to still images (e.g., information related to user-selectable objects in still images) may be communicated.
- the communication network 130 may comprise characteristics of any one or more of: a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.
- the exemplary media system 100 may include a first media presentation device 140 .
- a first media presentation device 140 may, for example, comprise networking capability enabling such media presentation device 140 to communicate directly with the communication network(s) 130 .
- the first media presentation device 140 may comprise one or more embedded media receivers or transceivers (e.g., a cable television transceiver, satellite television transceiver, Internet modem, wired and/or wireless LAN transceiver, wireless PAN transceiver, etc.).
- the first media presentation device 140 may comprise one or more recording devices (e.g., for recording and/or playing back media content, still images, etc.).
- the first media presentation device 140 may, for example, operate to (which includes “operate when enabled to”) perform any or all of the functionality discussed herein.
- the first media presentation device 140 may, for example, operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects.
- the first media presentation device 140 may also, for example, operate to receive and process information of a still image and information of user-selectable objects in the still image, where such user-selectable object information and such image information are communicated independently (e.g., received in independent data files, received in independent data streams, etc.).
- the exemplary media system 100 may include a first media controller 160 .
- a first media controller 160 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the first media presentation device 140 .
- the first media controller 160 may comprise characteristics of any of a variety of media presentation controlling devices.
- the first media controller 160 may comprise characteristics of a dedicated media center control device, a dedicated image presentation device controller, a dedicated television controller, a universal remote control, a cellular telephone or personal computing device with media presentation control capability, etc.
- the first media controller 160 may, for example, transmit signals directly to the first media presentation device 140 to control operation of the first media presentation device 140 .
- the first media controller 160 may also, for example, operate to transmit signals (e.g., via the communication network(s) 130 ) to the media information provider 110 and/or the third party image information provider 120 to control image information (or information related to an image) being provided to the first media presentation device 140 or other device with image presentation capability, or to conduct other transactions (e.g., business transactions, etc.).
- the first media controller 160 may operate to communicate screen (or display) pointing information with the first media presentation device 140 and/or other devices.
- various aspects of the present invention include a user pointing to a location on a display (e.g., pointing to an animate or inanimate user-selectable object presented in an image on the display).
- the user may perform such pointing in any of a variety of manners.
- One of such exemplary manners includes pointing with a user device.
- the first media controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location.
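Determining which user-selectable object (if any) corresponds to an on-screen pointing location amounts to a hit test against the objects' regions. A minimal sketch, assuming regions are axis-aligned boxes (an assumption; the text permits arbitrary region shapes):

```python
def object_at_pointing_location(objects, x, y):
    """Return the first user-selectable object whose region contains the
    on-screen pointing location (x, y), or None if the user pointed at no
    object.  Regions are assumed to be boxes (x0, y0, x1, y1)."""
    for obj in objects:
        x0, y0, x1, y1 = obj["region"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return obj
    return None
```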
- the first media controller 160 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects.
- the first media controller 160 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., a non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently.
- the first media presentation device 140 and first media controller 160 provide a non-limiting example of a user's local media presentation system.
- a user's local media presentation system generally refers to the media-related devices that are local to the media presentation system currently being utilized by the user.
- the user's local media presentation system generally refers to the media-related devices that make up the user's home media presentation system.
- the user's local media presentation system generally refers to the media-related devices that make up the media presentation system at the user's current premises.
- Such a user's local media presentation system does not, for example, comprise media network infrastructure devices that are generally outside of the user's current premises (e.g., Internet nodes, cable and/or satellite head-end apparatus, cable and/or satellite intermediate communication network nodes, etc.) and/or media source devices that are generally managed by media enterprises and generally exist outside of the user's premises.
- Such entities, which may be communicatively coupled to the user's local media presentation system, may be considered to be entities remote from the user's local media presentation system (or "remote entities").
- the exemplary media system 100 may also include a media (e.g., still image) receiver 151 .
- the media receiver 151 may, for example, operate to (e.g., which may include “operate when enabled to”) provide a communication link between a media presentation device and/or media controller and a communication network and/or information provider.
- the media receiver 151 may operate to provide a communication link between the second media presentation device 141 and the communication network(s) 130 , or between the second media presentation device 141 and the media information provider 110 (and/or third party image information provider 120 ) via the communication network(s) 130 .
- the media receiver 151 may comprise characteristics of any of a variety of types of media receivers.
- the media receiver 151 may comprise characteristics of a cable television receiver, a satellite television receiver, a still image receiver, a personal computer, a still picture (or still image) camera, a moving picture camera, etc.
- the media receiver 151 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.).
- the media receiver 151 may also, for example, comprise recording capability (e.g., still image recording and playback, etc.).
- the media receiver 151 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects.
- the media receiver 151 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently.
- the exemplary media system 100 may include a second media controller 161 .
- a second media controller 161 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the second media presentation device 141 and the media receiver 151 .
- the second media controller 161 may comprise characteristics of any of a variety of media presentation controlling devices.
- the second media controller 161 may comprise characteristics of a dedicated media center control device, a dedicated image presentation device controller, a dedicated television controller, a universal remote control, a cellular telephone or personal computing device with media presentation control capability, etc.
- the second media controller 161 may, for example, operate to transmit signals directly to the second media presentation device 141 to control operation of the second media presentation device 141 .
- the second media controller 161 may, for example, operate to transmit signals directly to the media receiver 151 to control operation of the media receiver 151 .
- the second media controller 161 may additionally, for example, operate to transmit signals (e.g., via the media receiver 151 and the communication network(s) 130 ) to the media information provider and/or the third party image information provider 120 to control image information (or information related to an image) being provided to the media receiver 151 , or to conduct other transactions (e.g., business transactions, etc.).
- various aspects of the present invention include a user selecting a user-selectable object in an image.
- selection may, for example, comprise the user pointing to a location on a display (e.g., pointing to an animate or inanimate object presented in an image on the display).
- the user may perform such pointing in any of a variety of manners.
- One of such exemplary manners includes pointing with a user device.
- the second media controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location.
- a user may touch a location of such touch screen to point to an on-screen location (e.g., to select a user-selectable object presented in an image presented on the touch screen).
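When a user touches a touch screen to select an object, the touch location is reported in display coordinates, which generally must be mapped into the still image's own coordinate space before any object lookup. A minimal sketch, assuming the image is scaled to fill the display (letterboxing and aspect-ratio handling are omitted for brevity):

```python
def display_to_image_coords(touch_x, touch_y, display_size, image_size):
    """Map a touch (or pointing) location on the display to the matching
    location in the still image's coordinate space."""
    display_w, display_h = display_size
    image_w, image_h = image_size
    return (touch_x * image_w / display_w, touch_y * image_h / display_h)
```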
- the second media presentation device 141 , media receiver 151 and second media controller 161 provide another non-limiting example of a user's local media system.
- the second media controller 161 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects.
- the second media controller 161 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently.
- the exemplary media system 100 has been presented to provide a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary media system 100 unless explicitly claimed.
- FIG. 2 is a flow diagram illustrating an exemplary method 200 for providing embedded information of selectable objects in a still image, for example a still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), in accordance with various aspects of the present invention.
- The exemplary method 200 may, for example, be implemented in a media system component (e.g., the media information provider 110, third party image information provider 120, a component of a communication network 130, first media presentation device 140, first media controller 160, second media presentation device 141, media receiver 151, or second media controller 161, shown in FIG. 1 and discussed previously).
- any or all aspects of the exemplary method 200 may be implemented in one or more media system components remote from the user's local media system. Also for example, any or all aspects of the exemplary method 200 may be implemented in one or more components of the user's local media system.
- the exemplary method 200 may, for example, begin executing at step 205 .
- the exemplary method 200 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided.
- the exemplary method 200 may begin executing in response to a user command to begin (e.g., a user at a media source, a user at a media production studio, a user at a media distribution enterprise, etc.), in response to still image information and/or information of user-selectable objects in a still image arriving at a system entity implementing the method 200, in response to an electronic request communicated from the external entity to a system entity implementing the method 200, in response to a timer, in response to a request from an end user and/or a component of a user's local media system for a still image including information of user-selectable objects, in response to a request from a user for a still image where such user is associated in a database with still images comprising user-selectable object information, etc.
- the exemplary method 200 may, for example at step 210 , comprise receiving image information (e.g., picture information) for a still image.
- still image information may, for example, be received with or without information describing user-selectable objects in such a still image.
- Step 210 may comprise receiving the still image information from any of a variety of sources, non-limiting examples of which will now be provided.
- step 210 may comprise receiving the still image information from a still image broadcasting company, from a data streaming company, from a still image studio, from a still image database or server, from a camera or other still image recording device, from a scanner, from a facsimile machine, from an Internet still image provider, etc.
- Step 210 may comprise receiving the still image information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.).
- hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), and/or computer memory devices (e.g., non-transitory computer readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.).
- Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200 .
- step 210 may comprise receiving the still image information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).
- step 210 may comprise receiving a completed still image data set (e.g., a complete picture data set) for a still image, the completed still image data set formatted for communicating the still image without information describing user-selectable objects in the still image.
- the received completed still image data set may be in conformance with a still image standard (e.g., JPEG, TIFF, GIF, bmp, etc.).
- a data set may be a data file (or set of logically linked data files) formatted in a JPEG or pdf format for normal presentation on a user's local image presentation system.
- Such a data set of a still image, when received at step 210, might not have information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be added, as will be explained below.
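As a hedged illustration of receiving such a completed data set, the sketch below assumes the data set is a JPEG byte stream and checks only its framing markers; passing the check says nothing about whether user-selectable object information is present.

```python
# Hedged sketch: assumes the completed still image data set is a JPEG
# byte stream. A JPEG data set begins with the SOI marker (0xFFD8) and
# ends with the EOI marker (0xFFD9).
def is_completed_jpeg_data_set(data: bytes) -> bool:
    return len(data) >= 4 and data[:2] == b"\xff\xd8" and data[-2:] == b"\xff\xd9"

# Minimal JPEG-like byte stream: SOI marker immediately followed by EOI.
sample = b"\xff\xd8\xff\xd9"
```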
- step 210 may comprise receiving still image information (e.g., picture information) for the still image prior to the still image information being formatted into a completed still image data set for communicating the still image.
- step 210 may comprise receiving still image information (e.g., a bitmap, partially encoded still image information, etc.) that will be formatted in accordance with a still image standard, but which has not yet been so formatted.
- Such a data set of a still image, when received at step 210, might not have information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be added, as will be explained below.
- step 210 may comprise receiving a completed still image data set (e.g., a complete picture data set) for the still image, the completed still image data set formatted for communicating the still image with information describing user-selectable objects in the still image.
- the received completed still image data set may be in conformance with a still image standard (e.g., JPEG, TIFF, GIF, etc.), or a variant thereof, that specifically accommodates information of user-selectable objects in the still image.
- the received completed still image (or picture) data set may be in conformance with a still image standard (e.g., JPEG et al., TIFF, GIF, JBIG et al., PNG, AGP, AI, ANI, BMP, DNG, DCS, DCR, ECW, EMF, ICO, PDF, etc.), or a variant thereof, that while not specifically accommodating information of user-selectable objects in the still image, allows for the incorporation of such information in unassigned data fields.
- a data set may be a data file (or set of logically linked data files) formatted in a JPEG format for normal presentation on a user's local image presentation system.
- Such a data set of a still image, when received at step 210, might comprise information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be deleted, modified and/or appended, as will be explained below.
- Step 210 may, for example, comprise receiving the still image information in digital and/or analog signals.
- while the examples provided above generally concerned the receipt of digital data, such examples are readily extendible to the receipt of analog still image information.
- step 210 may comprise receiving still image information for a still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of still image information or by any particular manner of receiving still image information unless explicitly claimed.
- the exemplary method 200 may, at step 220 , comprise receiving object information corresponding to a user-selectable object in the still image. Many non-limiting examples of receiving such object information will now be provided.
- Step 220 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which will now be provided.
- step 220 may comprise receiving the user-selectable object information from a media (or image) broadcasting company, from a media (or image) streaming company, from a media (or image) studio, from a still image database or server, from an advertising company, from a commercial enterprise associated with a user-selectable object in a still image, from a person or organization associated with a user-selectable object in a still image, from an Internet still image provider, from a third party still image information source, from an end user's process executing on an end user's personal computer, etc.
- Step 220 may comprise receiving the user-selectable object information from a plurality of independent sources.
- in a scenario in which a plurality of parties each has an interest in a respective user-selectable object in a still image, step 220 may comprise receiving the user-selectable object information from each of such respective interested parties.
- step 220 may comprise receiving user-selectable object information corresponding to a user-selectable consumer good in a still image from a provider of such consumer good, receiving user-selectable object information corresponding to an entertainer in the still image from the entertainer's management company, receiving user-selectable object information corresponding to a user-selectable historical landmark in the still image from a society associated with the historical landmark, receiving user-selectable object information corresponding to a user-selectable object in the still image associated with a service from a provider of such service, etc.
- step 220 may comprise aggregating the user-selectable object information received from the plurality of sources (e.g., into a single user-selectable object data set) for ultimate combination of such user-selectable object information with received still image information.
- Step 220 may, for example, comprise receiving the user-selectable object information from a same source as that from which the still image information was received at step 210 or may comprise receiving the user-selectable object information from a different source.
- step 220 may comprise receiving the user-selectable object information from an advertising company, while step 210 comprises receiving the still image information from a still image studio.
- step 220 may comprise receiving the user-selectable object information from a commercial enterprise associated with a consumer good object presented in the still image, while step 210 comprises receiving the still image information from an image server of a sports network.
- step 220 may comprise receiving the user-selectable object information directly from a computer process that generates such information.
- a computer process may display a still image on an operator station and utilize graphical tools (e.g., boxes or other polygons, edge detection routines, etc.) to define a user-selectable object in the still image.
- Such a computer process may then output information describing the object in the still image.
- Step 220 may comprise receiving the information output from such process.
- Step 220 may comprise receiving the user-selectable object information via any of a variety of types of communication networks; many examples of such networks were provided previously.
- Such networks may, for example, comprise any of a variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).
- Such networks may, for example, comprise a media network (e.g., terrestrial and/or satellite media network).
- Step 220 may, for example, comprise receiving the user-selectable object information via a same communication network as that via which the still image information was received at step 210 or may comprise receiving the user-selectable object information from a different communication network.
- step 220 may comprise receiving the user-selectable object information via a general data communication network (e.g., the Internet), while step 210 comprises receiving the still image information via a television network.
- step 220 may comprise receiving the user-selectable object information via a general data network, while step 210 comprises receiving the still image information from a computer readable medium (e.g., a non-transitory computer readable medium).
- Step 220 may comprise receiving the user-selectable object information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.).
- hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), and/or computer memory devices (e.g., non-transitory computer readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.).
- Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200 .
- step 220 may comprise receiving the user-selectable object information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).
- the object information corresponding to one or more user-selectable objects that is received at step 220 may comprise any of a variety of characteristics, non-limiting examples of which will now be provided.
- such user-selectable object information may comprise information describing and/or defining the user-selectable object that is shown in the still image.
- Such information may, for example, be processed by a recipient of such information to identify an object that is being selected by a user.
- Such information may, for example, comprise information describing boundaries associated with a user-selectable object in the still image (e.g., actual object boundaries (e.g., an object outline), areas generally coinciding with a user-selectable object (e.g., a description of one or more geometric shapes that generally correspond to a user-selectable object), selection areas that when selected indicate user-selection of a user-selectable object (e.g., a superset and/or subset of a user-selectable object in the still image), etc.).
- Such information may, for example, describe and/or define the user-selectable object in a still image coordinate system.
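The boundary descriptions above can be sketched as a simple hit test in the still image coordinate system; the record fields ("object_id", "bounds") are illustrative assumptions, not terms from this disclosure.

```python
# Hypothetical sketch: identify which user-selectable object, if any, a
# user has selected, given selection areas expressed as rectangles
# (left, top, right, bottom) in the still image coordinate system.
def hit_test(objects, x, y):
    """Return the id of the first object whose selection area contains (x, y)."""
    for obj in objects:
        left, top, right, bottom = obj["bounds"]
        if left <= x <= right and top <= y <= bottom:
            return obj["object_id"]
    return None

# Illustrative object information records (all values assumed).
objects = [
    {"object_id": "athlete-1", "bounds": (10, 10, 120, 200)},
    {"object_id": "landmark-1", "bounds": (300, 40, 480, 260)},
]
```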
- such user-selectable object information may comprise information describing the object, where such information may be presented to the user upon user-selection of a user-selectable object.
- object information may comprise information describing physical characteristics of a user-selectable object, background information, historical information, general information of interest, location information, financial information, travel information, commerce information, personal information, etc.
- such user-selectable object information may comprise information describing and/or defining actions that may be taken upon user-selection of a user-selectable object; non-limiting examples of such actions and/or related information corresponding to a respective user-selectable object will now be presented.
- such user-selectable object information may comprise information describing one or more manners of determining information to present to the user (e.g., retrieving such information from a known location, conducting a search for such information, etc.), establishing a communication session by which a user may interact with networked entities associated with a user-selected object, interacting with a user regarding display of a user-selected object and/or associated information, etc.
- such user-selectable object information may comprise information describing one or more manners of obtaining one or more sets of information, where such information may then, for example, be presented to the user.
- information may comprise a memory address (or data storage address) and/or a communication network address (e.g., an address of a networked data server, a URL, etc.), where such address may correspond to a location at which information corresponding to the identified object may be obtained.
- Such information may, for example, comprise a network address of a component with which a communication session may be initiated and/or conducted (e.g., to obtain information regarding the user-selected object, to interact with the user regarding the selected object, etc.).
- in a scenario in which the user-selectable object information comprises information to present to a user upon user-selection of a selectable object in a still image, such information may comprise any of a variety of different types of information related to the user-selected object.
- information may comprise information describing the user-selectable object (e.g., information describing aspects of the object, history of the object, design of the object, source of the object, price of the object, critiques of the object, information provided by commercial enterprises producing and/or providing such object, etc.), information indicating to the user how the user may obtain the selected object, information indicating how the user may utilize the selected object, etc.
- the information may, for example, comprise information of one or more non-commercial organizations associated with, and/or having information pertaining to, the identified user-selected object (e.g., non-profit and/or government organization contact information, web site address information, etc.).
- the information corresponding to a user-selectable object in the still image may comprise information related to conducting a search for information corresponding to the user-selectable object.
- information may, for example, comprise network search terms that may be utilized in a search engine to search for information corresponding to the user-selected object.
- Such information may also comprise information describing the network boundaries of such a search, for example, identifying particular search networks, particular servers, particular addresses, particular databases, etc.
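A scoped search of the kind described above might be represented as follows; every field name and value here is an illustrative assumption, not part of this disclosure.

```python
# Hedged sketch: object information carrying search terms plus the
# network boundaries of the search (particular servers and databases).
search_spec = {
    "terms": ["historical landmark", "visitation information"],
    "servers": ["images.example.com"],   # particular servers (assumed)
    "databases": ["landmarks"],          # particular databases (assumed)
}

def build_scoped_query(spec):
    """Turn an object information search spec into a query description."""
    return {
        "q": " ".join(spec["terms"]),
        "sites": list(spec["servers"]),
        "dbs": list(spec["databases"]),
    }
```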
- the information corresponding to a user-selectable object may describe a manner in which a system is to interact with a user to more clearly identify information desired by the user.
- information may comprise information specifying user interaction that should take place when an amount of information available and corresponding to a user-selectable object exceeds a particular threshold.
- Such user interaction may, for example, help to reduce the amount of information that may ultimately be presented to the user.
- information may comprise information describing a user interface comprising providing a list (or menu) of types of information available to the user and soliciting information from the user regarding the selection of one or more of the listed types of information.
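The threshold-driven interaction just described can be sketched as follows; the threshold value and the information-type names are assumptions for illustration.

```python
# Hedged sketch: when the amount of information available for a
# user-selected object exceeds a threshold, plan a menu of information
# types for the user to choose from instead of presenting everything.
def plan_presentation(info_items, threshold=3):
    if len(info_items) <= threshold:
        return {"action": "present", "items": info_items}
    # Offer a list (or menu) of available types of information.
    choices = sorted({item["type"] for item in info_items})
    return {"action": "menu", "choices": choices}
```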
- the user-selectable object information may comprise information describing the manner in which a communication session may be established and/or managed.
- in a scenario in which an action associated with a user-selectable object comprises providing a user interface by which a user may initiate and perform a commercial transaction regarding the user-selectable object, the user-selectable object information may comprise information describing the manner in which the commercial transaction is to be performed (e.g., order forms, financial information exchange, order tracking, etc.).
- various user-selectable objects may, for example, be associated with any of a variety of respective actions that may be taken upon selection of a respective user-selectable object by a user.
- Such actions (e.g., information retrieval, information searching, communication session management, commercial transaction management, etc.) may, for example, be included in a table or other data structure indexed by the identity of a respective user-selectable object.
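Such a table might, for example, be sketched as a mapping indexed by object identity; the object identities and action names below are illustrative assumptions.

```python
# Hedged sketch: a table of actions indexed by the identity of a
# respective user-selectable object, with a default for objects that
# have no specifically assigned action.
ACTION_TABLE = {
    "consumer-good-42": "commercial_transaction",
    "landmark-7": "information_retrieval",
}
DEFAULT_ACTION = "information_search"

def action_for(object_id):
    """Look up the action to take upon user-selection of an object."""
    return ACTION_TABLE.get(object_id, DEFAULT_ACTION)
```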
- object information corresponding to user-selectable objects in a still image may comprise: athlete information (e.g., statistics, personal information, professional information, history, etc.), entertainer information (e.g., personal information, discography and/or filmography information, information of related organizations, fan club information, photograph and/or video information, etc.), landmark information (e.g., historical information, visitation information, location information, mapping information, photo album information, visitation diary, charitable donation information, etc.), political figure information (e.g., party affiliation, stances on particular issues, history, financial information, voting record, attendance record, etc.), information regarding general types of objects (e.g., information describing actions to take upon user-selection of a person object, of a consumer good object, of a landmark object, etc.) and/or specific objects (e.g., information describing actions to take when a particular person object is selected, when a particular consumer good object is selected, when a particular landmark object is selected, etc.).
- the above-mentioned types of information corresponding to user-selectable objects in a still image may be general to all eventual viewers (or recipients) of the still image, but may also be customized to a particular target user and/or end user.
- information may be customized to a particular user (e.g., based on income level, demographics, age, employment status and/or type, education level and/or type, family characteristics, religion, purchasing history, neighborhood characteristics, home characteristics, health characteristics, etc.).
- such information may also be customized to a particular geographical location or region.
- step 220 may comprise receiving object information corresponding to a user-selectable object in the still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of such user-selectable object information or by any particular manner of receiving such user-selectable object information unless explicitly claimed.
- the exemplary method 200 may, at step 230 , comprise combining the received still image information (e.g., as received at step 210 ) and the received user-selectable object information (e.g., as received at step 220 ) in a combined data set.
- Many non-limiting examples of such combining will now be provided.
- step 210 may comprise receiving still image information (e.g., a still image data set) for a still image (e.g., a photographic image) by, at least in part, receiving a completed still image data set for the still image (e.g., formatted in accordance with a still image communication and/or compression standard), where the completed still image data set is formatted for communicating (or storing) the still image without information describing user-selectable objects in the still image.
- step 230 may comprise combining the received still image information and the received user-selectable object information by, at least in part, inserting the received user-selectable object information in the completed still image data set to create a combined data set comprising the received still image data set and the received user-selectable object information.
- step 230 may comprise inserting the received user-selectable object information in data fields of the completed still image data set that are not assigned by the still image standard (e.g., a JPEG standard) for any specific type of information (e.g., inserting such information into unassigned data fields and/or metadata fields provided by the still image standard, adding new data fields to the still image standard, etc.).
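As a hedged illustration of carrying object information in data fields not assigned by a still image standard to any specific type of information, the sketch below inserts an application (APPn) segment into a JPEG byte stream; the choice of APP11 and the "UOBJ" tag are assumptions for illustration, not defined by any standard.

```python
import struct

# Hedged sketch: wrap user-selectable object information in a JPEG
# application segment and insert it immediately after the SOI marker.
def insert_object_info(jpeg_bytes: bytes, object_info: bytes) -> bytes:
    marker = b"\xff\xeb"                           # APP11 marker (assumed choice)
    payload = b"UOBJ" + object_info                # tag + object information
    length = struct.pack(">H", len(payload) + 2)   # length field counts itself
    segment = marker + length + payload
    # SOI marker (0xFFD8) occupies the first two bytes of the data set.
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```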
- Such inserting may, for example, comprise inserting the received user-selectable object information in data fields of the completed still image data set that are interleaved with data fields carrying still image data.
- inserting may be performed in accordance with a format alternating still image data and user-selectable object information (or data) on a pixel-by-pixel basis (e.g., sequencing pixel 1 still image data, pixel 1 user-selectable object information, sequencing pixel 2 still image data, pixel 2 user-selectable object information, etc.), by groups of pixels (e.g., pixel 1-A still image data, pixel 1-A user-selectable object information, pixel A-N still image data, pixel A-N user-selectable object information, etc.), by lines of pixels, by blocks of pixels, etc.
- user-selectable object information need not be strictly placed with the still image data for the still image in which the user-selectable object appears.
- information of user-selectable objects in a still image and/or portion thereof may be communicated before and/or after the image data set for the entire still image is communicated.
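The block-by-block interleaving described above can be sketched as follows; the pixels are assumed to have already been grouped into blocks (by pixel, group of pixels, line, etc.).

```python
# Hedged sketch: alternate a block of still image data with the
# user-selectable object information corresponding to that block,
# producing a single interleaved sequence.
def interleave(image_blocks, object_blocks):
    """Yield image data and object information alternately, block by block."""
    combined = []
    for img, obj in zip(image_blocks, object_blocks):
        combined.append(("image", img))
        combined.append(("object", obj))
    return combined
```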
- step 230 may comprise inserting the received user-selectable object information in the data fields of the completed still image data set that are specifically assigned by the still image standard to contain information of user-selectable objects.
- step 210 may comprise receiving still image information (e.g., a still image data set) for a still image (e.g., a photographic image) by, at least in part, receiving still image information for the still image prior to the still image information being formatted into a completed still image data set for communicating (or storing) the still image.
- step 210 may comprise receiving information describing the still image that has yet to be formatted into a data set that conforms to a particular still image standard (e.g., bitmap information, DCT information, etc., which has yet to be placed into a self-contained JPEG data set for communicating and/or storing the still image).
- step 230 may comprise combining the received still image information and the received user-selectable object information into a completed still image data set that is formatted for communicating and/or storing the still image with information describing user-selectable objects in the still image (e.g., into a single cohesive data set, for example, a single data file or other data structure, into a plurality of logically linked data files or other data structures, etc.).
- such a completed still image data set may be formatted in accordance with a still image standard that specifically assigns respective data fields (or elements) to information describing the still image and to user-selectable object information.
- such a completed still image data set may be formatted in accordance with a still image standard that specifically assigns data fields to information describing a still image, but does not specifically assign data fields to user-selectable object information (e.g., utilizing general-purpose unassigned data fields, adding new data fields to the standard, etc.).
- step 210 may comprise receiving still image information for a still image by, at least in part, receiving an initial combined still image data set that comprises initial still image information and initial user-selectable object information corresponding to user-selectable objects in the still image.
- the received initial combined still image data set may have already been formed into a single cohesive data set that comprises the still image information for the still image and information of user-selectable objects in the still image.
- step 230 may comprise modifying the initial user-selectable object information of the initial combined still image data set in accordance with the received user-selectable object information (e.g., as received at step 220 ).
- modifying may, for example and without limitation, comprise adding the received object information to the initial object information in the initial combined still image data set (e.g., in unused unassigned data fields and/or in unused data fields that have been specifically assigned to contain user-selectable object information, etc.).
- Such modifying may comprise changing at least a portion of the initial object information of the initial combined still image data set in accordance with the received user-selectable object information (e.g., changing information defining a user-selectable object in a presented still image, changing information about a user-selectable object to be presented to a user, changing information regarding any action that may be performed upon user-selection of a user-selectable object, etc.).
- such modifying may comprise deleting at least a portion of the initial object information in accordance with the received user-selectable object information (e.g., in a scenario in which the received user-selectable object information includes a command or directive to remove a portion or all information corresponding to a particular user-selectable object).
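The add/change/delete modifications described above can be sketched as directives applied to the initial object information, keyed by object identity; the directive format is an assumption for illustration.

```python
# Hedged sketch: modify the initial user-selectable object information
# of an initial combined still image data set in accordance with
# received directives (add, change, or delete per object).
def apply_directives(initial_info, directives):
    result = dict(initial_info)
    for d in directives:
        if d["op"] in ("add", "change"):
            result[d["object_id"]] = d["info"]
        elif d["op"] == "delete":
            result.pop(d["object_id"], None)  # remove if present
    return result
```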
- step 230 may comprise performing such operations automatically (i.e., without real-time interaction with a user while such operations are being performed) and/or with user interaction.
- the received still image information and the received user-selectable object information may each be uniquely identified to assist in merging such information.
- step 230 may comprise analyzing such respective unique identifications to determine the still image data set in which the user-selectable object information is to be inserted.
- the user-selectable object information for a particular user-selectable object may comprise information identifying the specific still image in which the user-selectable object appears.
- Such information may be utilized at step 230 to determine the appropriate data set (e.g., a still image data file or other bounded data set) in which to place the user-selectable object information.
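Using such unique identifications to merge object information into the right still image data set can be sketched as follows; the field names ("image_id", "objects") are illustrative assumptions.

```python
# Hedged sketch: route each received user-selectable object record into
# the still image data set identified by the record's image identifier.
def route_object_info(object_records, image_data_sets):
    for record in object_records:
        target = image_data_sets.get(record["image_id"])
        if target is not None:
            target["objects"].append(record)
    return image_data_sets
```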
- step 230 may comprise presenting an operator with a view of the still image and a view of a user-selectable object in such still image for which information is being added to a combined data set.
- Step 230 may then comprise interacting with the operator to obtain permission and/or directions for combining the still image and user-selectable object information.
- step 230 may comprise encrypting the user-selectable object information or otherwise restricting access to such information. For example, in a scenario in which access to such information is provided on a subscription basis, in a scenario in which providers of such information desire to protect such information from undesirable access and/or manipulation, etc., such information protection may be beneficial.
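One way of otherwise protecting the object information, sketched below, is integrity protection rather than encryption: attach a keyed digest so that a recipient holding the (e.g., subscription-provisioned) key can detect undesired manipulation. The key handling here is purely illustrative.

```python
import hashlib
import hmac

# Hedged sketch: seal user-selectable object information with an
# HMAC-SHA256 tag, and verify the tag before trusting the information.
def seal(object_info: bytes, key: bytes) -> bytes:
    tag = hmac.new(key, object_info, hashlib.sha256).digest()
    return tag + object_info

def verify(sealed: bytes, key: bytes) -> bytes:
    tag, object_info = sealed[:32], sealed[32:]
    expected = hmac.new(key, object_info, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("object information failed verification")
    return object_info
```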
- step 230 may comprise combining the received still image information (e.g., as received at step 210 ) and the received user-selectable object information (e.g., as received at step 220 ) in a combined data set. Accordingly, the scope of various aspects of the present invention should not be limited by any particular manner of performing such combining and/or any particular format in which such a combined data set may be placed unless specifically claimed.
- the exemplary method 200 may, at step 240 , comprise communicating the combined data set(s) (e.g., as formed at step 230 ) to one or more recipient systems or devices.
- Such communication may comprise characteristics of any of a variety of types of communication, non-limiting examples of which will now be presented.
- Step 240 may, for example, comprise communicating the combined data set(s) via a communication network (e.g., a television communication network, a telecommunication network, a general data communication network (e.g., the Internet, a LAN, a PAN, etc.), etc.). Many non-limiting examples of such communication network were provided previously.
- Step 240 may, for example, comprise broadcasting, multi-casting and/or uni-casting the combined data set over one or more communication networks.
- Step 240 may also, for example, comprise communicating the combined data set(s) to another system and/or device via a direct conductive path (e.g., via a wire, circuit board trace, conductive trace on a die, etc.).
- step 240 may comprise storing the combined data set(s) on a computer readable medium (e.g., a DVD, a CD, a Blu-ray® disc, a laser disc, a magnetic tape, a hard drive, a diskette, etc.). Such a computer readable medium may then, for example, be shipped to a distributor and/or ultimate recipient of the computer readable medium. Further for example, step 240 may comprise storing the combined data set(s) in a volatile and/or non-volatile memory device (e.g., a flash memory device, a one-time-programmable memory device, an EEPROM, a RAM, etc.).
- step 240 may comprise storing (or causing or otherwise participating in the storage of) the combined data set(s) in a media system component (e.g., a component or device of the user's local media (or still image presentation) system, a component or device of a media (or still image) provider, and/or a component or device of any still image information source).
- step 240 may comprise storing the combined data set(s), or otherwise participating in the storage of the combined data set(s), in a component of the user's local media system (e.g., in an image presentation device, a digital video recorder, a media receiver, a media player, a media system controller, personal communication device, a local networked database, a local networked personal computer, etc.).
- Step 240 may, for example, comprise communicating the combined data set in serial fashion.
- step 240 may comprise communicating the combined data set (comprising interleaved still image information and user-selectable object information) in a single data stream (e.g., via a general data network, via a television or other media network, stored on a hard medium, for example a non-transitory computer-readable medium, in such serial fashion, etc.).
- step 240 may comprise communicating the combined data set in parallel data streams, each of which comprises interleaved still image information and user-selectable object information (e.g., as opposed to separate distinct respective data streams for each of still image information and user-selectable object information).
- step 240 may comprise communicating the combined data set(s) (e.g., as formed at step 230 ) to one or more recipient systems or devices (e.g., an end user or associated system, media (or image) provider or associated system, an advertiser or associated system, a media (or image) producer or associated system, a media (or image) database, a media (or image) server, etc.).
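By way of non-limiting illustration, the single serial stream of interleaved still image information and user-selectable object information discussed above may be sketched as follows. The chunk framing (a one-byte type tag followed by a length and payload) and the JSON encoding of the object information are illustrative assumptions of this sketch, not part of the disclosure.

```python
import json
import struct

def interleave_combined_stream(image_bytes, object_info, chunk_size=4096):
    """Sketch of a single serial stream interleaving still image data with
    user-selectable object information. Each chunk is framed as
    (1-byte type tag, 4-byte big-endian length, payload); the framing
    format is a hypothetical assumption for illustration only."""
    meta = json.dumps(object_info).encode("utf-8")
    stream = bytearray()
    # The object information is framed first so a recipient will have
    # received it before the still image is fully received.
    stream += struct.pack(">cI", b"O", len(meta)) + meta
    for start in range(0, len(image_bytes), chunk_size):
        chunk = image_bytes[start:start + chunk_size]
        stream += struct.pack(">cI", b"I", len(chunk)) + chunk
    return bytes(stream)
```

A recipient may then walk the stream tag by tag, routing "O" chunks to object-information handling and "I" chunks to still image reconstruction.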
- the exemplary method 200 may, for example at step 295 , comprise performing continued operations.
- Step 295 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) are presented below.
- step 295 may comprise returning execution flow to any of the previously discussed method steps.
- step 295 may comprise returning execution flow of the exemplary method 200 to step 220 for receiving additional user-selectable object information to combine with still image information.
- step 295 may comprise returning execution flow of the exemplary method 200 to step 210 for receiving additional still image information and user-selectable object information to combine with such received still image information.
- step 295 may comprise returning execution flow of the exemplary method 200 to step 240 for additional communication of the combined information to additional recipients.
- step 295 may comprise performing continued operations (e.g., performing additional operations corresponding to combining still image information and information of user-selectable objects in such still images, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.
- FIG. 3 is a flow diagram illustrating an exemplary method 300 for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention.
- the exemplary method 300 may, for example, share any or all characteristics with the exemplary method 200 illustrated in FIG. 2 and discussed previously. Any or all aspects of the exemplary method 300 may, for example, be implemented in a media system component (e.g., the media information provider 110 , third party image information provider 120 , a component of a communication network 130 , first media presentation device 140 , first media controller 160 , second media presentation device 141 , media receiver 151 , second media controller 161 , shown in FIG. 1 and discussed previously) and/or a plurality of such media system components operating in conjunction.
- any or all aspects of the exemplary method 300 may be implemented in one or more media (or image) system components remote from the user's local media system.
- any or all aspects of the exemplary method 200 may be implemented in one or more components of the user's local media (or image) system.
- the exemplary method 300 may, for example, begin executing at step 305 .
- the exemplary method 300 may begin executing in response to any of a variety of causes or conditions.
- Step 305 may, for example, share any or all characteristics with step 205 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- the exemplary method 300 may, for example at step 310 , comprise receiving image information for a still image.
- Step 310 may, for example, share any or all characteristics with step 210 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- step 310 may comprise receiving any of the various types of still image information from any of the various sources of still image information via any of the various communication media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.
- step 310 may comprise, for example at sub-step 312 , receiving a completed still image data set for the still image, the completed still image data set formatted for communicating and/or storing the still image without information describing user-selectable objects in the still image.
- step 310 may comprise, for example at sub-step 314 , receiving still image information for the still image prior to the still image information being formatted into a completed still image data set for communicating and/or storing the still image.
- step 310 may comprise, for example at sub-step 316 , receiving a completed still image data set for the still image, the completed still image data set formatted for communicating and/or storing the still image with information describing user-selectable objects in the still image.
- the exemplary method 300 may, for example at step 320 , comprise receiving object information corresponding to a user-selectable object in the still image.
- Step 320 may, for example, share any or all characteristics with step 220 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- step 320 may comprise receiving any of the various types of user-selectable object information from any of the various sources of user-selectable object information via any of the various types of media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.
- step 320 may comprise, for example at sub-step 322 , receiving user-selectable object information comprising information describing and/or defining the user-selectable object that is shown in the still image (e.g., object dimension information, object movement information, etc.). Also for example, step 320 may comprise, for example at sub-step 324 , receiving user-selectable object information comprising information regarding the user-selectable object that may be presented to the user upon user-selection of such object in a still image.
- step 320 may comprise, for example at sub-step 326 , receiving user-selectable object information comprising information describing and/or defining actions that may be taken upon user-selection of a user-selectable object (e.g., retrieving and/or obtaining and/or searching for information about a user-selectable object, information specifying a manner in which a system is to interact with a user regarding a user-selected object, searching for information, establishing and/or maintaining communication sessions, information describing the manner in which the commercial transaction is to be performed, etc.).
- the exemplary method 300 may, for example at step 330 , comprise combining the received still image information (e.g., as received at step 310 ) and the received user-selectable object information (e.g., as received at step 320 ) in a combined data set.
- Step 330 may, for example, share any or all characteristics with step 230 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- step 330 may comprise, for example at sub-step 332 , inserting the received user-selectable object information in a completed still image data set that was received at step 320 (e.g., inserting such user-selectable object information in fields of the still image data set that are specified by a standard for carrying such user-selectable object information, inserting such user-selectable object information in fields of the still image data set that are not specifically allocated for a particular type of data, etc.).
- step 330 may comprise, for example at sub-step 334 , combining received still image data and received user-selectable object information into a completed still image data set that is formatted for communicating the still image with information describing user-selectable objects in the still image. Additionally for example, step 330 may comprise, for example at sub-step 336 , modifying initial user-selectable object information of an initial combined still image data set in accordance with received user-selectable object information.
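By way of non-limiting illustration, the insertion of user-selectable object information into a completed still image data set (e.g., sub-step 332 above) may be sketched as follows. The choice of a JPEG APP15 application segment and a JSON payload are illustrative assumptions of this sketch; a standard could equally allocate a different field for such information.

```python
import json
import struct

def embed_object_info(jpeg_bytes, object_info):
    """Sketch of sub-step 332: insert user-selectable object information
    into a completed still image data set. Here the information rides in
    a JPEG APP15 (0xFFEF) application segment placed immediately after
    the SOI marker; segment choice and payload format are assumptions."""
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG data set (missing SOI marker)")
    payload = json.dumps(object_info).encode("utf-8")
    # Per JPEG marker-segment structure, the 2-byte length field counts
    # itself plus the payload (but not the 2-byte marker).
    segment = b"\xff\xef" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```

Because decoders that do not recognize the segment skip it by its length field, the still image remains presentable by systems unaware of user-selectable objects.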
- the exemplary method 300 may, for example at step 340 , comprise communicating the combined data set(s) (e.g., as formed at step 330 ) to one or more recipient systems or devices.
- Step 340 may, for example, share any or all characteristics with step 240 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- step 340 may comprise, for example at sub-step 342 , communicating the combined data set(s) via a communication network (e.g., any of a variety of communication networks discussed herein, etc.). Also for example, step 340 may comprise, for example, at sub-step 344 , communicating the combined data set(s) by storing the combined data set(s) on a non-transitory computer readable medium and/or by transmitting the combined data set(s) to another device or system to perform such storage. Additionally for example, step 340 may comprise, for example, at sub-step 346 , communicating the combined data set in a single serial stream (e.g., comprising interleaved still image data and user-selectable object information).
- step 340 may comprise, for example, at sub-step 348 , communicating the combined data set in a plurality of parallel serial streams (e.g., each of such streams comprising interleaved still image data and user-selectable object information).
- the exemplary method 300 may, for example at step 395 , comprise performing continued operations.
- Step 395 may, for example, share any or all characteristics with step 295 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.
- various aspects of the present invention may comprise incorporating information of user-selectable objects in a still image into a combined data set that comprises such information along with information generally descriptive of the still image, and communication of such a combined data set.
- various aspects of the present invention may comprise forming, communicating and/or storing a user-selectable object data set that is independent of a corresponding still image data set.
- FIG. 4 is a flow diagram illustrating an exemplary method 400 for providing information of selectable objects in a still image in an information stream independent of the still image, in accordance with various aspects of the present invention.
- Any or all aspects of the exemplary method 400 may, for example, be implemented in a media system component (e.g., the media information provider 110 , third party image information provider 120 , a component of a communication network 130 , first media presentation device 140 , first media controller 160 , second media presentation device 141 , media receiver 151 , second media controller 161 , shown in FIG. 1 and discussed previously) and/or a plurality of such media system components operating in conjunction.
- any or all aspects of the exemplary method 400 may be implemented in one or more media system components remote from the user's local media system. Also for example, any or all aspects of the exemplary method 400 may be implemented in one or more components of the user's local media system.
- the exemplary method 400 may, for example, begin executing at step 405 .
- the exemplary method 400 may begin executing in response to any of a variety of causes and/or conditions.
- Step 405 may, for example, share any or all characteristics with steps 205 and 305 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously.
- the exemplary method 400 may, for example at step 410 , comprise receiving image information for a still image.
- Step 410 may, for example, share any or all characteristics with steps 210 and 310 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously.
- the exemplary method 400 may, for example at step 420 , comprise determining user-selectable object information corresponding to one or more user-selectable objects in a still image.
- Step 420 may, for example, share any or all characteristics with steps 220 and 320 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously.
- step 420 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which were provided previously (e.g., in the discussion of step 220 and elsewhere herein).
- the object information corresponding to one or more user-selectable objects that is determined at step 420 may comprise any of a variety of characteristics; numerous examples of such object information were provided previously (e.g., in the discussion of step 220 and elsewhere herein).
- the exemplary method 400 may, at step 430 , comprise forming a user-selectable object data set comprising the determined user-selectable object information (e.g., as determined at step 420 ), where the user-selectable object data set is independent of a still image data set (e.g., as received at step 410 ) generally representative of the still image.
- Step 430 may comprise performing such data set formation in any of a variety of manners, non-limiting examples of which will now be presented.
- step 430 may comprise forming the user-selectable object data set (e.g., a data file or other data structure, a logical grouping of data, etc.) in a manner that is spatially synchronized with a still image (or a still image data set representative of a still image).
- step 430 may comprise forming the user-selectable object data set by, at least in part, parsing the user-selectable object information in a manner that logically mirrors the still image data set blocks.
- the user-selectable object information describing the user-selectable object may be placed in a corresponding block (e.g., Nth block, data segment, etc.) of the user-selectable object data set.
- the user-selectable object data set might include null (or no) information in blocks corresponding to still image blocks that do not include any user-selectable objects.
- the user-selectable object data set need not include information for block P if corresponding block P of the still image does not include any user-selectable objects.
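By way of non-limiting illustration, the block-synchronized user-selectable object data set described above, in which entry N corresponds to block N of the still image and blocks containing no user-selectable objects carry null entries, may be sketched as follows. The record fields ("object_id", "bbox") are illustrative assumptions of this sketch.

```python
def build_block_synchronized_set(num_blocks, objects):
    """Sketch of a block-parallel user-selectable object data set: entry N
    describes the objects visible in block N of the still image, and
    blocks with no user-selectable objects carry None (null) entries.
    The object record fields are illustrative assumptions."""
    data_set = [None] * num_blocks
    for obj in objects:
        for block in obj["blocks"]:  # blocks in which the object appears
            entry = data_set[block] or []
            entry.append({"object_id": obj["object_id"],
                          "bbox": obj["bbox"]})  # object extent in the block
            data_set[block] = entry
    return data_set
```

A recipient decoding block N of the still image may then consult entry N of this data set to determine whether that block contains any selectable region.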
- step 430 may comprise forming the user-selectable object data set by, at least in part, including information indicating the blocks of the still image in which the user-selectable object appears (e.g., along with the dimensions of the user-selectable object and/or other spatially descriptive information).
- step 430 may comprise incorporating information into the user-selectable object data set that indicates the user-selectable object appears in blocks A-B of the still image, along with information describing the dimensions and/or locations of the user-selectable object in such blocks of the still image.
- the user-selectable object data set includes information that spatially synchronizes the user-selectable object data set to the still image data set
- not all information of the user-selectable object data set need be so synchronized.
- information corresponding to user-selectable objects that is not spatially-specific may be included in the user-selectable object data set in an unsynchronized (or asynchronous) manner.
- information describing user-selectable objects (or selectable regions thereof) as such user-selectable objects appear in a presented still image may be spatially-synchronized (e.g., block-synchronized) to the still image data set, while information to be presented to the user upon user-selection of such user-selectable objects and/or information describing any action to take upon user-selection of such user-selectable objects may be included in the user-selectable object data set in an unsynchronized manner (e.g., in a data structure (or sub-data structure) that is indexed by object identity to retrieve such information).
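By way of non-limiting illustration, the hybrid layout described above, in which spatially specific information is kept block-synchronized while presentation and action information is stored once in a structure indexed by object identity, may be sketched as follows. The field names ("info", "action", "regions") are illustrative assumptions of this sketch.

```python
def split_object_data_set(objects):
    """Sketch of a hybrid user-selectable object data set: spatially
    specific information (where each selectable region appears) is kept
    block-synchronized, while information to present to the user and
    actions to take on selection are stored once, unsynchronized, in a
    structure indexed by object identity. Field names are assumptions."""
    by_block, by_object = {}, {}
    for obj in objects:
        # Stored once per object, retrieved by object identity on selection.
        by_object[obj["object_id"]] = {
            "info": obj["info"],      # presented to the user on selection
            "action": obj["action"],  # e.g., retrieve information, begin a transaction
        }
        # Stored per block, spatially synchronized to the still image.
        for block, bbox in obj["regions"].items():
            by_block.setdefault(block, []).append(
                {"object_id": obj["object_id"], "bbox": bbox})
    return {"by_block": by_block, "by_object": by_object}
```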
- the user-selectable object data set may comprise time synchronization information indicating that such user-selectable object data set corresponds to the particular time window.
- step 430 may comprise incorporating data markers into the user-selectable object data set that correspond to respective markers in a still image data set.
- step 430 may comprise incorporating data pointers into the user-selectable object data set that point to respective absolute and/or relative locations within a still image data set.
- the user-selectable object information may also comprise information to be provided to the user upon selection of a user-selectable object, information describing communication sessions and/or other actions that may be performed upon selection of the user-selectable object, etc.
- information may be incorporated into the user-selectable object data set at step 430 .
- step 430 may comprise incorporating such user-selectable object information into the user-selectable object data set in a manner that provides for indexing such information by object identity.
- such information need only be incorporated into the user-selectable object data set one time (e.g., positioned in the user-selectable object data set such that a recipient of the user-selectable object data set will have received such information prior to user selection of the user-selectable object corresponding to such information).
- step 430 may comprise forming the user-selectable object data set such that, when communicated to a user's local media (or image) presentation system, information of actions to perform upon user selection of the consumer good in the still image will have been received by the user's local media (or image) presentation system prior to the user's first opportunity to select the consumer good in the still image.
- the user-selectable object data set formed at step 430 may comprise characteristics of different types of data sets (or structures).
- step 430 may comprise forming a data file that comprises the user-selectable object information.
- a user-selectable object data file may, for example, comprise metadata that correlates the user-selectable object data file to one or more corresponding still image data files that are utilized to communicate the general still image (e.g., without user-selectable object information).
- Step 430 may also, for example, comprise forming an array of the user-selectable object information.
- Such an array may, for example, comprise an array of records associated with respective user-selectable objects in a still image and may be indexed and/or sorted by object identification.
- step 430 may comprise forming a linked list of respective data records corresponding to user-selectable objects in the still image.
- Such a linked list may, for example, be a multi-dimensional linked list with user-selectable objects in a first dimension and respective records associated with different types of information associated with a particular user-selectable object in a second dimension.
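By way of non-limiting illustration, the multi-dimensional linked list described above may be sketched as follows, with user-selectable objects chained in a first dimension and each object's information records chained in a second dimension. The class and field names are illustrative assumptions of this sketch.

```python
class RecordNode:
    """One record of object information (second dimension)."""
    def __init__(self, record_type, payload):
        self.record_type = record_type  # e.g., "dimensions", "info", "action"
        self.payload = payload
        self.next_record = None

class ObjectNode:
    """One user-selectable object (first dimension)."""
    def __init__(self, object_id):
        self.object_id = object_id
        self.records = None      # head of this object's record list
        self.next_object = None  # next user-selectable object

def add_record(obj_node, record_type, payload):
    """Prepend a record to an object's record list."""
    node = RecordNode(record_type, payload)
    node.next_record = obj_node.records
    obj_node.records = node
    return node
```

Traversing the first dimension locates an object by identity; traversing its second dimension yields the different types of information (spatial, presentational, action-related) associated with that object.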
- the user-selectable object data set may be independent of one or more still image data sets generally representative of the still image.
- Such an implementation advantageously provides for independent formation and maintenance of the user-selectable object data set that corresponds to the still image.
- a still image data set (e.g., a still image data file, a JPEG file, etc.) generally representative of the still image may be developed and maintained by one party, while a data set for user-selectable objects in the still image may be developed (e.g., by an advertising company, by a sponsor, by a network operator, by one or more components of a user's local media system, etc.) independently.
- the user-selectable object data set may be developed and/or changes may be made to the user-selectable object data set without impacting the still image data set.
- user-selectable object information may be customized to a user or group of users.
- a plurality of different user-selectable object data sets may be developed that each correspond to the same still image data set.
- step 430 may comprise forming a first user-selectable object data set for a New York audience or recipient of a still image, and forming a second user-selectable object data set for a Los Angeles audience or recipient of the still image without necessitating modification of the still image data set, which communicates the still image in the same manner to each of the New York and Los Angeles audiences or recipients.
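By way of non-limiting illustration, such audience-specific customization may be sketched as follows: the still image data set is left unchanged, while each audience (e.g., New York vs. Los Angeles) receives its own user-selectable object data set. The "offer" field and the region keys are illustrative assumptions of this sketch.

```python
def object_data_set_for(region, base_objects, regional_offers):
    """Sketch of audience-customized user-selectable object data sets:
    the same still image data set is communicated to all audiences,
    while each region receives a distinct object data set. The 'offer'
    field and region keys are illustrative assumptions."""
    data_set = []
    for obj in base_objects:
        record = dict(obj)  # copy so the shared base objects stay unchanged
        record["offer"] = regional_offers[region].get(obj["object_id"])
        data_set.append(record)
    return data_set
```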
- step 430 may comprise forming a user-selectable object data set comprising the determined user-selectable object information (e.g., as determined at step 420 ), where the user-selectable object data set is independent of a still image data set generally representative of the still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular types of user-selectable object data, characteristics of particular types of user-selectable object data sets, and/or characteristics of any particular manner of forming user-selectable object data sets unless explicitly claimed.
- the exemplary method 400 may, at step 440 , comprise communicating the formed user-selectable object data set (e.g., as formed at step 430 ) to one or more recipients.
- Step 440 may comprise performing such communicating in any of a variety of manners, non-limiting examples of which will now be provided.
- step 440 may comprise communicating the user-selectable object data set in one or more data streams (which may be called “user-selectable object data streams” herein) independent of one or more still image data streams that generally communicate the still image (i.e., that generally communicate the still image data set).
- the user-selectable object data set may comprise information of user-selectable objects in the still image that supplement (e.g., append and/or amend) information of user-selectable objects that might be present in the still image data set.
- Step 440 may, for example, comprise communicating the user-selectable object data set time-synchronized to communication of the still image data set. For example, even in a scenario in which the user-selectable object data set is independent of the general still image data set, step 440 may still time-synchronize communication of the user-selectable object data set with communication of the general still image data set.
- step 440 may comprise communicating the user-selectable object data concurrently (e.g., simultaneously and/or pseudo-simultaneously in a time-sharing manner) with communication of the still image data set that generally communicates the still image.
- concurrent communication may comprise communicating at least a portion of the user-selectable object data set and at least a portion of the still image data set in a time-multiplexed manner (e.g., via a shared communication channel (e.g., a frequency channel, a code channel, a time/frequency channel, etc.)).
- concurrent communication may comprise communicating the user-selectable object data set in parallel with communication of the still image data set (e.g., on separate respective sets of one or more parallel communication channels).
- step 440 may comprise communicating the user-selectable object data set via at least one communication channel that is different from one or more communication channels over which the still image data set is communicated. For example, even in a scenario in which the user-selectable object data set and the still image data set are communicated over at least one shared communication channel, step 440 may comprise communicating the user-selectable object data set in at least one communication channel that is different from the communication channel(s) over which the still image data set is communicated.
- Step 440 may, for example, comprise communicating the user-selectable object data set over a first communication network that is different from a second communication network over which the still image data set is communicated.
- step 440 may comprise communicating the user-selectable object data set over a first communication network (e.g., a first general data communication network), where the still image data set is communicated over a second communication network (e.g., a second general data communication network).
- Step 440 may, for example, comprise communicating the user-selectable object data set over a first type of communication network that is different from a second type of communication network over which the still image data set is communicated.
- step 440 may comprise communicating the user-selectable object data set over a first general data communication network, where the still image data set is communicated over a television communication network (e.g., a cable television network, a satellite television network, etc.).
- step 440 may comprise communicating the user-selectable object data set utilizing a first communication protocol that is different from a second communication protocol that is utilized to communicate the still image data set.
- step 440 may comprise communicating the user-selectable data set utilizing TCP/IP, while the general still image data set is communicated utilizing a cable television protocol.
- step 440 may comprise communicating the user-selectable object data set to a first set of one or more user local media (or image) presentation systems, where the first set is a subset of a second set of user local media (or image) presentation systems to which the still image data set is communicated.
- step 440 may comprise multicasting the user-selectable object data set to a multicast group, where the still image data set is broadcast to a superset of the multicast group.
- step 440 may comprise unicasting the user-selectable object data set to a single user local media (or image) presentation system, where the still image data set is broadcast or multicast to a superset of the single user.
- step 440 may comprise communicating the user-selectable object data set to a first set of one or more components of a user's local media (or image) presentation system, where at least a portion of such first set is different from a second set of one or more components of the user's local media (or image) presentation system to which the still image data set is communicated.
- step 440 may comprise communicating the user-selectable object data set to the media controller and not to the media receiver.
- Step 440 may comprise communicating the user-selectable object data set with or without regard for the timing of the communication of the still image (e.g., the still image data set) to which the user-selectable object data set corresponds.
- step 440 may comprise communicating the user-selectable object data set whenever the still image data set is communicated.
- step 440 may comprise communicating the entire user-selectable object data set before the still image data set is communicated. In such a scenario, the recipient of the communicated user-selectable object data set may be assured of having received such data set prior to receipt of the still image to which the user-selectable object data set corresponds.
- step 440 may also comprise communicating the user-selectable object data set to a storage device where the user-selectable object data set is stored in a storage medium, for example an optical storage medium (e.g., a compact disc, a digital versatile disc, a Blu-ray® disc, a laser disc, etc.), a magnetic storage medium (e.g., a hard disc, a diskette, magnetic tape, etc.), a computer memory device (e.g., a non-transitory computer-readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, a thumb drive, etc.), etc.
- step 440 may comprise communicating the user-selectable object data set to a storage device where the user-selectable object data set is stored in a same storage medium as a medium on which the still image data set is stored.
- the user-selectable object data set may be stored in one or more data structures that are independent of one or more data structures in which the still image data set is stored (e.g., stored in one or more separate data files).
- step 440 may comprise communicating the user-selectable object data set to one or more devices of the user's local media system (e.g., a media receiver, a digital video recorder, a media presentation device, a media controller, a personal computer, etc.) and/or one or more devices of a media source system and/or one or more devices of a media distribution system for storage in such device(s).
- step 440 may comprise communicating the formed user-selectable object data set (e.g., as formed at step 430 ) to one or more recipients (e.g., an end user or associated system, still image provider or associated system, an advertiser or associated system, a still image producer or associated system, a still image database, a still image server, etc.).
- the exemplary method 400 may, for example at step 495 , comprise performing continued operations.
- Step 495 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) are presented below.
- step 495 may comprise returning execution flow to any of the previously discussed method steps.
- step 495 may comprise returning execution flow of the exemplary method 400 to step 420 for receiving additional user-selectable object information to form into an independent user-selectable object data set and communicate.
- step 495 may comprise returning execution flow of the exemplary method 400 to step 440 for additional communication of the user-selectable object data set (e.g., to additional recipients).
- step 495 may comprise performing continued operations (e.g., performing additional operations corresponding to forming and/or communicating user-selectable object data sets related to user-selectable objects in a still image). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.
- FIG. 5 is a diagram illustrating an exemplary media system 500 (e.g., a single media system component and/or a plurality of media system components), in accordance with various aspects of the present invention.
- the exemplary media system 500 may, for example, share any or all characteristics with one or more of the media system components illustrated in FIG. 1 and discussed previously.
- the exemplary media system 500 may correspond to any of the media system components illustrated in FIG. 1 (or the like) or any group of the media system components illustrated in FIG. 1 (or the like).
- the exemplary media system 500 may comprise characteristics of a computing system (e.g., a personal computer, a mainframe computer, a digital signal processor, etc.).
- the exemplary media system 500 (e.g., various modules thereof) may operate to perform any or all of the functionality discussed previously with regard to the exemplary methods 200, 300 and 400 illustrated in FIGS. 2-4.
- the exemplary media system 500 includes a first communication interface module 510 .
- the first communication interface module 510 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- though the first communication interface module 510 is illustrated as coupled to a wireless RF antenna via a wireless port 512, the wireless medium is merely illustrative and non-limiting.
- the first communication interface module 510 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which still image-related information (e.g., still image information, information of user-selectable objects in a still image, still image information with and without embedded information of user-selectable objects) and/or other data is communicated.
- the first communication interface module 510 may operate to communicate with local sources of still image-related content or other data (e.g., disc drives, computer-readable medium readers, video or image recorders, cameras, computers, receivers, personal electronic devices, cellular telephones, personal digital assistants, personal media players, etc.). Additionally, for example, the first communication interface module 510 may operate to communicate with a remote controller (e.g., directly or via one or more intermediate communication networks).
- the exemplary media system 500 includes a second communication interface module 520 .
- the second communication interface module 520 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- the second communication interface module 520 may communicate via a wireless RF communication port 522 and antenna, or may communicate via a non-tethered optical communication port 524 (e.g., utilizing laser diodes, photodiodes, etc.).
- the second communication interface module 520 may communicate via a tethered optical communication port 526 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 528 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.).
- the second communication interface module 520 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which still image-related information (e.g., still image information, information of user-selectable objects in a still image, image information with and without embedded information of user-selectable objects) and/or other data is communicated.
- the second communication module 520 may operate to communicate with local sources of still image-related information (e.g., disc drives, computer-readable medium readers, video or image recorders, cameras, computers, receivers, personal electronic devices, cellular telephones, personal digital assistants, personal media players, etc.).
- the second communication module 520 may operate to communicate with a remote controller (e.g., directly or via one or more intervening communication networks).
- the exemplary media system 500 may also comprise additional communication interface modules, which are not illustrated in FIG. 5 (some of which may be shown in FIG. 6). Such additional communication interface modules may, for example, share any or all aspects with the first 510 and second 520 communication interface modules discussed above.
- the exemplary media system 500 may also comprise a communication module 530 .
- the communication module 530 may, for example, operate to control and/or coordinate operation of the first communication interface module 510 and the second communication interface module 520 (and/or additional communication interface modules as needed).
- the communication module 530 may, for example, provide a convenient communication interface by which other components of the media system 500 may utilize the first 510 and second 520 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 530 may coordinate communications to reduce collisions and/or other interference between the communication interface modules.
- the exemplary media system 500 may additionally comprise one or more user interface modules 540 .
- the user interface module 540 may generally operate to provide user interface functionality to a user of the media system 500 .
- the user interface module 540 may operate to provide for user control of any or all standard media system commands (e.g., channel control, volume control, on/off, screen settings, input selection, etc.).
- the user interface module 540 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the media system 500 (e.g., buttons, etc.) and may also utilize the communication module 530 (and/or the first 510 and second 520 communication interface modules) to communicate with other systems and/or components thereof (e.g., a media system controller, such as a dedicated media system remote control, a universal remote control, a cellular telephone, personal computing device, gaming controller, etc.) regarding still image-related information, user interaction that occurs during the formation of combined data set(s), etc.
- the user interface module(s) 540 may operate to utilize the optional display 570 to communicate with a user regarding user-selectable object information and/or to present still image information to a user.
- the user interface module 540 may also comprise one or more sensor modules that operate to interface with and/or control operation of any of a variety of sensors that may be utilized during the formation of the combined data set(s).
- the one or more sensor modules may be utilized to ascertain an on-screen pointing location, which may, for example, be utilized to input and/or receive user-selectable object information (e.g., to indicate and/or define user-selectable objects in a still image).
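The relationship between an on-screen pointing location and a user-selectable object can be sketched as a simple hit test. The following Python sketch is purely illustrative (the rectangular region format, field names, and object labels are assumptions, not part of the disclosure): given object regions, it determines which object, if any, contains the pointing location.

```python
def identify_selected_object(objects, point_x, point_y):
    """Return the first user-selectable object whose on-screen region
    contains the pointing location, or None if no object was pointed at."""
    for obj in objects:
        r = obj["region"]
        if (r["x"] <= point_x < r["x"] + r["width"]
                and r["y"] <= point_y < r["y"] + r["height"]):
            return obj
    return None

# Two example objects with assumed rectangular on-screen regions.
objects = [
    {"object_id": 1, "label": "wristwatch",
     "region": {"x": 120, "y": 80, "width": 60, "height": 40}},
    {"object_id": 2, "label": "automobile",
     "region": {"x": 300, "y": 200, "width": 150, "height": 90}},
]

selected = identify_selected_object(objects, 130, 95)  # inside object 1
missed = identify_selected_object(objects, 10, 10)     # outside all regions
```

A real system might use non-rectangular regions or screen-coordinate scaling, but the core selection logic reduces to a containment test like this one.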
- the user interface module 540 (or sensor module(s) thereof) may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices, via the communication interface modules 510 , 520 , etc.).
- the user interface module 540 may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors.
- the user interface module 540 may perform any of a variety of still image output functions (e.g., presenting still image information to a user, presenting user-selectable object information to a user, providing visual feedback to a user regarding an identified user-selected object in a presented still image, etc.).
- the exemplary media system 500 may comprise one or more processors 550 .
- the processor 550 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc.
- the processor 550 may operate in accordance with software (or firmware) instructions.
- any or all functionality discussed herein may be performed by a processor executing instructions.
- while various modules are illustrated as separate blocks or modules in FIG. 5, such illustrative modules, or a portion thereof, may be implemented by the processor 550.
- the exemplary media system 500 may comprise one or more memories 560 . As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 560 . Such memory 560 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 560 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc.
- the exemplary media system 500 may comprise one or more modules 552 (e.g., still image information receiving module(s)) that operate to receive still image information for a still image.
- Such one or more modules 552 may, for example, operate to utilize the communication module 530 (e.g., and at least one of the communication interface modules 510 , 520 ) and/or the user interface module(s) 540 to receive such still image information.
- such one or more modules 552 may operate to perform step 210 of the exemplary method 200 discussed previously and/or step 310 of the exemplary method 300 discussed previously.
- the exemplary media system 500 may comprise one or more module(s) 554 (e.g., user-selectable object information receiving module(s)) that operate to receive object information corresponding to one or more user-selectable objects in a still image.
- Such one or more modules 554 may, for example, operate to utilize the communication module 530 (e.g., and at least one of the communication interface modules 510 , 520 ) and/or the user interface module(s) 540 to receive such still image user-selectable object information.
- such one or more modules 554 may operate to perform step 220 of the exemplary method 200 discussed previously and/or step 320 of the exemplary method 300 discussed previously.
- the exemplary media system 500 may comprise one or more modules 556 (e.g., still image and user-selectable object information combining module(s)) that operate to combine received still image information (e.g., as received by the module(s) 552 ) and received user-selectable object information (e.g., as received by the module(s) 554 ) into a combined data set.
- modules 556 may, for example, operate to receive still image information from the module(s) 552 , receive user-selectable object information from the module(s) 554 , combine such received still image information and user-selectable object information into a combined data set, and output such combined data set.
- Such one or more modules 556 may operate to perform step 230 of the exemplary method 200 discussed previously and/or step 330 of the exemplary method 300 discussed previously.
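The combining operation performed by module(s) 556 can be sketched at the byte level. The sketch below assumes a simple length-prefixed binary layout for the combined data set; the actual format is not specified by the text, and all names are illustrative.

```python
import json
import struct

def combine_data_sets(image_bytes, object_data_set):
    """Combine a still image data set and a user-selectable object data set
    into a single combined data set (illustrative length-prefixed layout)."""
    obj_bytes = json.dumps(object_data_set).encode("utf-8")
    # Header: big-endian lengths of the image portion and the object portion.
    header = struct.pack(">II", len(image_bytes), len(obj_bytes))
    return header + image_bytes + obj_bytes

def split_combined_data_set(combined):
    """Recover both constituent data sets from a combined data set."""
    img_len, obj_len = struct.unpack(">II", combined[:8])
    image_bytes = combined[8:8 + img_len]
    obj_bytes = combined[8 + img_len:8 + img_len + obj_len]
    return image_bytes, json.loads(obj_bytes.decode("utf-8"))

image_bytes = b"\xff\xd8\xff\xe0FAKE-JPEG-DATA"  # placeholder image payload
object_data_set = {"objects": [{"object_id": 1, "label": "wristwatch"}]}
combined = combine_data_sets(image_bytes, object_data_set)
recovered_image, recovered_objects = split_combined_data_set(combined)
```

The single combined blob can then be communicated or stored as one unit, while a recipient can still recover the still image information and the user-selectable object information separately.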
- the exemplary media system 500 may comprise one or more modules 558 (e.g., combined data set communication module(s)) that operate to communicate the combined data set to at least one recipient system and/or device.
- such module(s) 558 may operate to utilize the communication module(s) 530 (and, for example, one or both of the first communication interface module(s) 510 and second communication interface module(s) 520 )) to communicate the combined data set.
- module(s) 558 may operate to communicate the combined data set to one or more system devices that store the combined data set on a physical medium (e.g., a non-transitory computer-readable medium).
- Such one or more modules 558 may operate to perform step 240 of the exemplary method 200 discussed previously and/or step 340 of the exemplary method 300 discussed previously.
- the exemplary media system 500 may, for example, comprise one or more modules that operate to perform any or all of the processing discussed previously with regard to the exemplary method 400.
- Such modules (e.g., as with the one or more modules 552, 554, 556 and 558) may, for example, be implemented by the processor(s) 550 executing instructions stored in the memory 560.
- Such module(s) may, for example, comprise one or more image receiving module(s) that operate to perform the still image receiving functionality discussed previously with regard to step 410.
- Such module(s) may also, for example, comprise one or more user-selectable object information determining module(s) that operate to perform the information determining functionality discussed previously with regard to step 420.
- Such module(s) may additionally, for example, comprise one or more user-selectable object data set formation module(s) that operate to perform the data set formation functionality discussed previously with regard to step 430.
- Such module(s) may further, for example, comprise one or more user-selectable object data set communication module(s) that operate to perform the communication functionality discussed previously with regard to step 440.
- the exemplary media system 500 may, for example, comprise one or more modules that operate to perform any or all of the continued processing discussed previously with regard to step 295 of the exemplary method 200, step 395 of the exemplary method 300, and step 495 of the exemplary method 400.
- Such modules (e.g., as with the one or more modules 552, 554, 556 and 558) may, for example, be implemented by the processor(s) 550 executing instructions stored in the memory 560.
- FIG. 6 is a diagram illustrating exemplary modules and/or sub-modules for a media system 600, in accordance with various aspects of the present invention.
- the exemplary media system 600 may share any or all aspects with the media system 500 illustrated in FIG. 5 and discussed previously.
- the exemplary media system 600 may, for example, share any or all characteristics with one or more of the media system components illustrated in FIG. 1 and discussed previously.
- the exemplary media system 600 may correspond to any of the media system components illustrated in FIG. 1 (or the like) or any group of the media system components illustrated in FIG. 1 (or the like).
- the exemplary media system 600 may operate to perform any or all functionality discussed herein with regard to the exemplary method 200 illustrated in FIG. 2 , the exemplary method 300 illustrated in FIG. 3 , and the exemplary method 400 illustrated in FIG. 4 .
- the media system 600 comprises a processor 630 .
- Such processor 630 may, for example, share any or all characteristics with the processor 550 discussed with regard to FIG. 5.
- the media system 600 comprises a memory 640 .
- Such memory 640 may, for example, share any or all characteristics with the memory 560 discussed with regard to FIG. 5 .
- the media system 600 may comprise any of a variety of user interface module(s) 650 .
- Such user interface module(s) 650 may, for example, share any or all characteristics with the user interface module(s) 540 discussed previously with regard to FIG. 5 .
- the user interface module(s) 650 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen), a vibrating mechanism, a keypad, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).
- the exemplary media system 600 may also, for example, comprise any of a variety of communication modules ( 605 , 606 , and 610 ). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 510 , 520 discussed previously with regard to FIG. 5 .
- the communication interface module(s) 610 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc.
- the exemplary media system 600 is also illustrated as comprising various wired 606 and/or wireless 605 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby.
- the exemplary media system 600 may also comprise any of a variety of signal processing module(s) 690 .
- Such signal processing module(s) 690 may share any or all characteristics with modules of the exemplary media system 500 that perform signal processing.
- Such signal processing module(s) 690 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.).
- the signal processing module(s) 690 may comprise: video/graphics processing modules (e.g., …); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.).
Abstract
Description
- This patent application is related to and claims priority from provisional patent application Ser. No. 61/242,234 filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21042US02; and U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A TELEVISION PROGRAM IN AN INFORMATION STREAM INDEPENDENT OF THE TELEVISION PROGRAM”, Attorney Docket No. 21043US02. This patent application is further related to U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21037US02; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21038US02; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21039US02; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21040US02; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21041US02; U.S. patent application Ser. No. 12/851,036, filed Aug. 
5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21051US02; U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21052US02. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.
- [Not Applicable]
- [Not Applicable]
- [Not Applicable]
- Present media systems are incapable of providing for and/or conveniently providing for user-selection of objects in a still image (e.g., a photograph). Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- Various aspects of the present invention provide a system and method for providing information of selectable objects in a still image and/or data stream, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.
- FIG. 1 is a diagram illustrating an exemplary media system, in accordance with various aspects of the present invention.
- FIG. 2 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention.
- FIG. 3 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention.
- FIG. 4 is a flow diagram illustrating an exemplary method for providing information of selectable objects in a still image in an information stream independent of the still image, in accordance with various aspects of the present invention.
- FIG. 5 is a diagram illustrating an exemplary media system, in accordance with various aspects of the present invention.
- FIG. 6 is a diagram illustrating exemplary modules and/or sub-modules for a media system, in accordance with various aspects of the present invention.
- The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- Additionally, the following discussion will refer to various media system modules (e.g., image presentation system modules, personal electronic device modules, computer system modules, camera modules, television modules, television receiver modules, television controller modules, modules of a user's local media system, modules of a geographically distributed media system, etc.). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
- The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, personal computer device, media presentation system, image presentation system, camera, media server, image server, television, television control device, television provider, television programming provider, television receiver, video and/or image recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), a general data communication network (e.g., the Internet), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- The following discussion may at times refer to an on-screen pointing location. Such a pointing location refers to a location on a video screen (e.g., a computer display, a display of a portable electronic device, a display of a digital photograph display, a television display, a primary television screen, a secondary television screen, etc.) to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing. Various aspects of the present invention, while referring to on-screen pointing location, are also readily extensible to such other forms of on-screen location identification.
- Additionally, the following discussion will at times refer to television programming. Such television programming may, for example, communicate still images. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded television programming, broadcast/multicast/unicast television programming, etc.). Such television programming may, for example, comprise real-time television broadcast programming (or multicast or unicast television programming) and/or user-stored television programming that is stored in a user device (e.g., a VCR, PVR, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a graphical video game, etc.).
- The following discussion will at times refer to still images. Such still images may, for example, comprise pictures. For example, such still images may correspond to still photographs (e.g., taken with a digital camera), scanned images created by a scanner, facsimile images, etc. Such still images may, for example, be represented in a data file (e.g., a JPEG file, a bitmap, a TIFF file, etc.), or other data structure, and may be communicated in one or more streams of data.
- Also, the following discussion will at times refer to user-selectable objects in a still image. Such user-selectable objects include both animate (i.e., living) and inanimate (i.e., non-living) objects. Such objects may, for example, comprise characteristics of any of a variety of objects present in still images. Such objects may, for example and without limitation, comprise inanimate objects, such as consumer good objects (e.g., clothing, automobiles, shoes, jewelry, furniture, food, beverages, appliances, electronics, toys, artwork, cosmetics, recreational vehicles, sports equipment, safety equipment, computer equipment, communication devices, books, etc.), premises objects (e.g., business locations, stores, hotels, signs, doors, buildings, landmarks, historical sites, entertainment venues, hospitals, government buildings, etc.), objects related to services (e.g., objects related to transportation, objects related to emergency services, objects related to general government services, objects related to entertainment services, objects related to food and/or drink services, etc.), objects related to location (e.g., parks, landmarks, streets, signs, road signs, etc.), etc. Such objects may, for example, comprise animate objects, such as people (e.g., actors/actresses, athletes, musicians, salespeople, commentators, reporters, analysts, hosts/hostesses, entertainers, etc.), animals (e.g., pets, zoo animals, wild animals, etc.) and plants (e.g., flowers, trees, shrubs, fruits, vegetables, cacti, etc.).
- Turning first to
FIG. 1, such figure is a diagram illustrating a non-limiting exemplary media system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a media information provider 110. The media information provider 110 may, for example, comprise a server associated with a media company, a cable company, an image-providing company, a movie-providing company, a news company, an educational institution, etc. The media information provider 110 may, for example, be an original source of still images (or related information). Also for example, the media information provider 110 may be a communication company that provides still image distribution services (e.g., a cable media company, a satellite media company, a telecommunication company, a data network provider, etc.). The media information provider 110 may, for example, provide both media (e.g., still image) information and non-media information. The media information provider 110 may, for example, provide both moving picture information and still picture information. - The
media information provider 110 may, for example, provide information related to a still image (e.g., information describing or otherwise related to user-selectable objects in still images, etc.). As will be discussed below in more detail, the media information provider 110 may operate to create and/or communicate a still image (e.g., a still image data set, still image data stream, etc.) that includes embedded information of user-selectable objects in the still image. For example and without limitation, such a media information provider 110 may operate to receive a completed initial still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), for example via a communication network and/or on a physical medium, and embed information of user-selectable objects in the completed initial still image data set. Also for example, such a media information provider 110 may operate to form an original still image data set (e.g., a data file or other bounded data structure, etc.) and embed information of user-selectable objects in the original still image data set during such formation (e.g., in the studio, on an enterprise computing system, on a personal computer, etc.). - Note that the
media information provider 110 may be remote from a user's local image presentation system (e.g., located at a premises different from the user's premises) or may be local to the user's local image presentation system (e.g., a personal media player, a digital photo presentation system, a DVR, a personal computing device, a personal electronic device, a personal cellular telephone, a personal digital assistant, a camera, a moving picture camera, an image recorder, a data server, an image and/or television receiver, a television, etc.). - The
media information provider 110 may alternatively, for example, operate to form and/or communicate a user-selectable object data set that includes information of user-selectable objects in a still image. Such a user-selectable object data set for a still image may, for example, be independent of a data set that generally represents the still image (e.g., generally represents the still image without information of user-selectable objects in such still image). For example and without limitation, such a media information provider 110 may operate to receive a completed still image data set (e.g., a data file or other finite group of data, a data stream, etc.), for example via a communication network and/or on a physical medium, and form the user-selectable object data set independent of the completed still image data set. Also for example, such a media information provider 110 may operate to form both an original still image data set and the corresponding user-selectable object data set. - The
exemplary media system 100 may also include a third party image information provider 120. Such a provider may, for example, provide information related to a still image. Such information may, for example, comprise information describing user-selectable objects in still images, media guide information, etc. As will be discussed below in more detail, such a third party image information provider (e.g., a party that may be independent of a still image source, media network operator, etc.) may operate to create a still image (or still image data set, still image data stream, etc.) that includes embedded information of user-selectable objects in the still image. For example and without limitation, such a third party image information provider 120 may operate to receive an initial completed still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), for example via a communication network and/or on a physical medium, and embed information of user-selectable objects in the initial completed still image data set. - The
exemplary media system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which still images and/or information related to still images (e.g., information related to user-selectable objects in still images) may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of any one or more of: a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc. - The
exemplary media system 100 may include a first media presentation device 140. Such a first media presentation device 140 may, for example, comprise networking capability enabling such media presentation device 140 to communicate directly with the communication network(s) 130. For example, the first media presentation device 140 may comprise one or more embedded media receivers or transceivers (e.g., a cable television transceiver, satellite television transceiver, Internet modem, wired and/or wireless LAN transceiver, wireless PAN transceiver, etc.). Also for example, the first media presentation device 140 may comprise one or more recording devices (e.g., for recording and/or playing back media content, still images, etc.). The first media presentation device 140 may, for example, operate to (which includes "operate when enabled to") perform any or all of the functionality discussed herein. The first media presentation device 140 may, for example, operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects. The first media presentation device 140 may also, for example, operate to receive and process information of a still image and information of user-selectable objects in the still image, where such user-selectable object information and such image information are communicated independently (e.g., received in independent data files, received in independent data streams, etc.). - The
exemplary media system 100 may include a first media controller 160. Such a first media controller 160 may, for example, operate to (e.g., which may include "operate when enabled to") control operation of the first media presentation device 140. The first media controller 160 may comprise characteristics of any of a variety of media presentation controlling devices. For example and without limitation, the first media controller 160 may comprise characteristics of a dedicated media center control device, a dedicated image presentation device controller, a dedicated television controller, a universal remote control, a cellular telephone or personal computing device with media presentation control capability, etc. - The
first media controller 160 may, for example, transmit signals directly to the first media presentation device 140 to control operation of the first media presentation device 140. The first media controller 160 may also, for example, operate to transmit signals (e.g., via the communication network(s) 130) to the media information provider 110 and/or the third party image information provider 120 to control image information (or information related to an image) being provided to the first media presentation device 140 or other device with image presentation capability, or to conduct other transactions (e.g., business transactions, etc.). - As will be discussed in more detail later, the
first media controller 160 may operate to communicate screen (or display) pointing information with the first media presentation device 140 and/or other devices. Also, as will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a display (e.g., pointing to an animate or inanimate user-selectable object presented in an image on the display). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a user device. The first media controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location. - Additionally, for example in a scenario in which the
first media controller 160 comprises an on-board display, the first media controller 160 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects. As another example, in such a scenario, the first media controller 160 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently. - As will be mentioned throughout the following discussion, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local media presentation system. The first
media presentation device 140 and first media controller 160 provide a non-limiting example of a user's local media presentation system. Such a user's local media presentation system, for example, generally refers to the media-related devices that are local to the media presentation system currently being utilized by the user. For example, when a user is utilizing a media presentation system located at the user's home, the user's local media presentation system generally refers to the media-related devices that make up the user's home media presentation system. Also for example, when a user is utilizing a media presentation system at a premises away from the user's home (e.g., at another home, at a hotel, at an office, etc.), the user's local media presentation system generally refers to the media-related devices that make up the premises media presentation system. Such a user's local media presentation system does not, for example, comprise media network infrastructure devices that are generally outside of the user's current premises (e.g., Internet nodes, cable and/or satellite head-end apparatus, cable and/or satellite intermediate communication network nodes) and/or media source devices that are generally managed by media enterprises and generally exist outside of the user's premises. Such entities, which may be communicatively coupled to the user's local media presentation system, may be considered to be entities remote from the user's local media presentation system (or "remote entities"). - The
exemplary media system 100 may also include a media (e.g., still image) receiver 151. The media receiver 151 may, for example, operate to (e.g., which may include "operate when enabled to") provide a communication link between a media presentation device and/or media controller and a communication network and/or information provider. For example, the media receiver 151 may operate to provide a communication link between the second media presentation device 141 and the communication network(s) 130, or between the second media presentation device 141 and the media information provider 110 (and/or third party image information provider 120) via the communication network(s) 130. - The
media receiver 151 may comprise characteristics of any of a variety of types of media receivers. For example and without limitation, the media receiver 151 may comprise characteristics of a cable television receiver, a satellite television receiver, a still image receiver, a personal computer, a still picture (or still image) camera, a moving picture camera, etc. Also for example, the media receiver 151 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The media receiver 151 may also, for example, comprise recording capability (e.g., still image recording and playback, etc.). - Additionally, for example in a scenario in which the
media receiver 151 comprises an on-board display and/or provides still image information to a display (or media presentation device) communicatively coupled thereto, the media receiver 151 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects. As another example, in such a scenario, the media receiver 151 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently. - The
exemplary media system 100 may include a second media controller 161. Such a second media controller 161 may, for example, operate to (e.g., which may include "operate when enabled to") control operation of the second media presentation device 141 and the media receiver 151. The second media controller 161 may comprise characteristics of any of a variety of media presentation controlling devices. For example and without limitation, the second media controller 161 may comprise characteristics of a dedicated media center control device, a dedicated image presentation device controller, a dedicated television controller, a universal remote control, a cellular telephone or personal computing device with media presentation control capability, etc. - The
second media controller 161 may, for example, operate to transmit signals directly to the second media presentation device 141 to control operation of the second media presentation device 141. The second media controller 161 may, for example, operate to transmit signals directly to the media receiver 151 to control operation of the media receiver 151. The second media controller 161 may additionally, for example, operate to transmit signals (e.g., via the media receiver 151 and the communication network(s) 130) to the media information provider 110 and/or the third party image information provider 120 to control image information (or information related to an image) being provided to the media receiver 151, or to conduct other transactions (e.g., business transactions, etc.). - As will be discussed in more detail later, various aspects of the present invention include a user selecting a user-selectable object in an image. Such selection may, for example, comprise the user pointing to a location on a display (e.g., pointing to an animate or inanimate object presented in an image on the display). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a user device. The
second media controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location. Also, in a scenario in which the second media controller 161 comprises a touch screen, a user may touch a location of such touch screen to point to an on-screen location (e.g., to select a user-selectable object presented in an image presented on the touch screen). - As will be mentioned throughout the following discussion, and as mentioned previously in the discussion of the first
media presentation device 140 and first media controller 160, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local media system. The second media presentation device 141, media receiver 151 and second media controller 161 provide another non-limiting example of a user's local media system. - Additionally, for example in a scenario in which the
second media controller 161 comprises an on-board display, the second media controller 161 may operate to receive and process still image information (e.g., via a communication network, stored on a physical medium or computer readable medium (e.g., a non-transitory computer readable medium), etc.), where such still image information comprises embedded information of user-selectable objects. As another example, in such a scenario, the second media controller 161 may operate to receive and process still image information and information of user-selectable objects in the still image (e.g., via one or more communication networks, stored on one or more physical media or computer readable media (e.g., non-transitory computer readable media), etc.), where such still image information and user-selectable object information are communicated independently. - The
exemplary media system 100 was provided as a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary media system 100 unless explicitly claimed. -
FIG. 2 is a flow diagram illustrating an exemplary method 200 for providing embedded information of selectable objects in a still image, for example a still image data set (e.g., a data file or other bounded data structure, a data stream, etc.), in accordance with various aspects of the present invention. Any or all aspects of the exemplary method 200 may, for example, be implemented in a media system component (e.g., the media information provider 110, third party image information provider 120, a component of a communication network 130, first media presentation device 140, first media controller 160, second media presentation device 141, media receiver 151, second media controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such media system components operating in conjunction. For example, any or all aspects of the exemplary method 200 may be implemented in one or more media system components remote from the user's local media system. Also for example, any or all aspects of the exemplary method 200 may be implemented in one or more components of the user's local media system. - The
exemplary method 200 may, for example, begin executing at step 205. The exemplary method 200 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided. For example, the exemplary method 200 may begin executing in response to a user command to begin (e.g., a user at a media source, a user at a media production studio, a user at a media distribution enterprise, etc.), in response to still image information and/or information of user-selectable objects in a still image arriving at a system entity implementing the method 200, in response to an electronic request communicated from an external entity to a system entity implementing the method 200, in response to a timer, in response to a request from an end user and/or a component of a user's local media system for a still image including information of user-selectable objects, in response to a request from a user for a still image where such user is associated in a database with still images comprising user-selectable objects, upon reset and/or power-up of a system component implementing the exemplary method 200, in response to identification of a user and/or user equipment for which object selection capability is to be provided, in response to user payment of a fee, etc. - The
exemplary method 200 may, for example at step 210, comprise receiving image information (e.g., picture information) for a still image. Various examples of such still images were provided above. Note that, depending on the particular implementation, such still image information may, for example, be received with or without information describing user-selectable objects in such a still image. - Step 210 may comprise receiving the still image information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 210 may comprise receiving the still image information from a still image broadcasting company, from a data streaming company, from a still image studio, from a still image database or server, from a camera or other still image recording device, from a scanner, from a facsimile machine, from an Internet still image provider, etc.
- Step 210 may comprise receiving the still image information via any of a variety of types of communication networks, non-limiting examples of which were provided above. Such networks may, for example, comprise any of a variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.). Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or a cable television network. Such networks may, for example, comprise a local wired network, a point-to-point communication link between two devices, etc.
- Step 210 may comprise receiving the still image information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), or a computer memory device (e.g., non-transitory computer readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the
method 200. For example, in a scenario including the utilization of such hard media, step 210 may comprise receiving the still image information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network). - In an exemplary scenario, step 210 may comprise receiving a completed still image data set (e.g., a complete picture data set) for a still image, the completed still image data set formatted for communicating the still image without information describing user-selectable objects in the still image. For example, the received completed still image data set may be in conformance with a still image standard (e.g., JPEG, TIFF, GIF, BMP, etc.). For example, such a data set may be a data file (or set of logically linked data files) formatted in a JPEG or PDF format for normal presentation on a user's local image presentation system. Such a data set of a still image, when received at
step 210, might not have information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be added, as will be explained below. - In another exemplary scenario, step 210 may comprise receiving still image information (e.g., picture information) for the still image prior to the still image information being formatted into a completed still image data set for communicating the still image. In an exemplary implementation,
step 210 may comprise receiving still image information (e.g., a bitmap, partially encoded still image information, etc.) that will be formatted in accordance with a still image standard, but which has not yet been so formatted. Such a data set of a still image, when received at step 210, might not have information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be added, as will be explained below. - In yet another exemplary scenario, step 210 may comprise receiving a completed still image data set (e.g., a complete picture data set) for the still image, the completed still image data set formatted for communicating the still image with information describing user-selectable objects in the still image. For example, the received completed still image data set may be in conformance with a still image standard (e.g., JPEG, TIFF, GIF, etc.), or a variant thereof, that specifically accommodates information of user-selectable objects in the still image. Also for example, the received completed still image (or picture) data set may be in conformance with a still image standard (e.g., JPEG et al., TIFF, GIF, JBIG et al., PNG, AGP, AI, ANI, BMP, DNG, DCS, DCR, ECW, EMF, ICO, PDF, etc.), or a variant thereof, that while not specifically accommodating information of user-selectable objects in the still image, allows for the incorporation of such information in unassigned data fields. For example, such a data set may be a data file (or set of logically linked data files) formatted in a JPEG format for normal presentation on a user's local image presentation system. Such a data set of a still image, when received at
step 210, might comprise information of user-selectable objects in the still image. Such information of user-selectable objects may then, for example, be deleted, modified and/or appended, as will be explained below. - Step 210 may, for example, comprise receiving the still image information in digital and/or analog signals. Though the examples provided above generally concerned the receipt of digital data, such examples are readily extendible to the receipt of analog still image information.
- In general,
step 210 may comprise receiving still image information for a still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of still image information or by any particular manner of receiving still image information unless explicitly claimed. - The
exemplary method 200 may, at step 220, comprise receiving object information corresponding to a user-selectable object in the still image. Many non-limiting examples of receiving such object information will now be provided. - Step 220 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 220 may comprise receiving the user-selectable object information from a media (or image) broadcasting company, from a media (or image) streaming company, from a media (or image) studio, from a still image database or server, from an advertising company, from a commercial enterprise associated with a user-selectable object in a still image, from a person or organization associated with a user-selectable object in a still image, from an Internet still image provider, from a third party still image information source, from an end user's process executing on an end user's personal computer, etc.
- Step 220 may comprise receiving the user-selectable object information from a plurality of independent sources. For example, in an exemplary scenario in which a still image includes user-selectable objects corresponding to a plurality of respective interested parties (e.g., respective product sponsors, respective leagues or other associations, respective people, etc.),
step 220 may comprise receiving the user-selectable object information from each of such respective interested parties. For example, step 220 may comprise receiving user-selectable object information corresponding to a user-selectable consumer good in a still image from a provider of such consumer good, receiving user-selectable object information corresponding to an entertainer in the still image from the entertainer's management company, receiving user-selectable object information corresponding to a user-selectable historical landmark in the still image from a society associated with the historical landmark, receiving user-selectable object information corresponding to a user-selectable object in the still image associated with a service from a provider of such service, etc. In such a multiple-source scenario, step 220 may comprise aggregating the user-selectable object information received from the plurality of sources (e.g., into a single user-selectable object data set) for ultimate combination of such user-selectable object information with received still image information. - Step 220 may, for example, comprise receiving the user-selectable object information from a same source as that from which the still image information was received at
step 210 or may comprise receiving the user-selectable object information from a different source. For example and without limitation, step 220 may comprise receiving the user-selectable object information from an advertising company, while step 210 comprises receiving the still image information from a still image studio. In another example, step 220 may comprise receiving the user-selectable object information from a commercial enterprise associated with a consumer good object presented in the still image, while step 210 comprises receiving the still image information from an image server of a sports network. - In yet another example, step 220 may comprise receiving the user-selectable object information directly from a computer process that generates such information. For example, an operator may display a still image on an operator station and utilize graphical tools (e.g., boxes or other polygons, edge detection routines, etc.) to define a user-selectable object in the still image. Such a computer process may then output information describing the object in the still image. Step 220 may comprise receiving the information output from such a process.
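The operator-drawn box just described ultimately reduces to a small, serializable record, and records from multiple interested parties can be aggregated into a single user-selectable object data set. The sketch below illustrates both steps; the JSON field names are hypothetical, chosen only for illustration.

```python
import json

def describe_object(label: str, box: tuple[int, int, int, int], source: str) -> str:
    """Serialize an operator-drawn bounding box as a JSON object description."""
    left, top, right, bottom = box
    return json.dumps({
        "label": label,
        "region": {"left": left, "top": top, "right": right, "bottom": bottom},
        "source": source,  # the interested party that supplied this description
    })

def aggregate(descriptions: list[str]) -> str:
    """Combine per-source object descriptions into one object data set."""
    return json.dumps({"objects": [json.loads(d) for d in descriptions]})
```

Such an aggregated data set could then be embedded in the still image data set or communicated independently of it, as discussed above.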
- Step 220 may comprise receiving the user-selectable object information via any of a variety of types of communication networks; many examples of such networks were provided previously. Such networks may, for example, comprise any of a variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.). Such networks may, for example, comprise a media network (e.g., terrestrial and/or satellite media network).
- Step 220 may, for example, comprise receiving the user-selectable object information via a same communication network as that via which the still image information was received at
step 210 or may comprise receiving the user-selectable object information from a different communication network. For example and without limitation, step 220 may comprise receiving the user-selectable object information via a general data communication network (e.g., the Internet), while step 210 comprises receiving the still image information via a television network. In another example, step 220 may comprise receiving the user-selectable object information via a general data network, while step 210 comprises receiving the still image information from a computer readable medium (e.g., a non-transitory computer readable medium). - Step 220 may comprise receiving the user-selectable object information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), computer memory devices (e.g., non-transitory computer readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the
method 200. For example, in a scenario including the utilization of such hard media, step 220 may comprise receiving the user-selectable object information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network). - The object information corresponding to one or more user-selectable objects that is received at
step 220 may comprise any of a variety of characteristics, non-limiting examples of which will now be provided. - For example, such user-selectable object information may comprise information describing and/or defining the user-selectable object that is shown in the still image. Such information may, for example, be processed by a recipient of such information to identify an object that is being selected by a user. Such information may, for example, comprise information describing boundaries associated with a user-selectable object in the still image (e.g., actual object boundaries (e.g., an object outline), areas generally coinciding with a user-selectable object (e.g., a description of one or more geometric shapes that generally correspond to a user-selectable object), selection areas that when selected indicate user-selection of a user-selectable object (e.g., a superset and/or subset of a user-selectable object in the still image), etc.). Such information may, for example, describe and/or define the user-selectable object in a still image coordinate system.
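As a minimal sketch of how a recipient might process such boundary descriptions to identify a selected object, the following uses axis-aligned selection rectangles in still-image pixel coordinates (one of the simpler boundary descriptions mentioned above); the `bbox` layout and object records are hypothetical:

```python
def hit_test(objects, px, py):
    """Return the ids of all objects whose selection area contains the
    pixel (px, py), expressed in the still image coordinate system."""
    hits = []
    for obj in objects:
        x, y, w, h = obj["bbox"]  # selection rectangle: origin, width, height
        if x <= px < x + w and y <= py < y + h:
            hits.append(obj["id"])
    return hits

objects = [
    {"id": "landmark-1", "bbox": (0, 0, 100, 100)},
    {"id": "entertainer-1", "bbox": (50, 50, 100, 100)},
]
```

Overlapping selection areas (a superset and/or subset of an object) simply yield multiple candidate objects for the recipient to disambiguate.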
- Many examples of such object description information are provided in a variety of related U.S. Patent Applications. For example, as mentioned previously, U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21037US02; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21038US02; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21039US02; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21040US02; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21041US02; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21051US02; and U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”, Attorney Docket No. 21052US02, which are hereby incorporated herein by reference in their entirety, provide many examples of information describing (or otherwise related to) user-selectable objects in television programming, which may also, for example, apply herein to user-selectable objects in still images.
- Also for example, such user-selectable object information may comprise information describing the object, where such information may be presented to the user upon user-selection of a user-selectable object. For example, such object information may comprise information describing physical characteristics of a user-selectable object, background information, historical information, general information of interest, location information, financial information, travel information, commerce information, personal information, etc.
- Additionally for example, such user-selectable object information may comprise information describing and/or defining actions that may be taken upon user-selection of a user-selectable object; non-limiting examples of such actions and/or related information corresponding to a respective user-selectable object will now be presented.
- For example, such user-selectable object information may comprise information describing one or more manners of determining information to present to the user (e.g., retrieving such information from a known location, conducting a search for such information, etc.), establishing a communication session by which a user may interact with networked entities associated with a user-selected object, interacting with a user regarding display of a user-selected object and/or associated information, etc.
- For example, such user-selectable object information may comprise information describing one or more manners of obtaining one or more sets of information, where such information may then, for example, be presented to the user. For example, such information may comprise a memory address (or data storage address) and/or a communication network address (e.g., an address of a networked data server, a URL, etc.), where such address may correspond to a location at which information corresponding to the identified object may be obtained. Such information may, for example, comprise a network address of a component with which a communication session may be initiated and/or conducted (e.g., to obtain information regarding the user-selected object, to interact with the user regarding the selected object, etc.).
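A minimal sketch of such "manner of obtaining information" data might look as follows; the field names and the example addresses (documentation-reserved values) are assumptions for illustration:

```python
object_info = {
    "object_id": "obj-001",
    # Address from which information about the object may be obtained;
    # example.com and the host below are documentation-reserved values.
    "info_url": "https://example.com/objects/obj-001",
    # Network address of a component with which a session may be initiated.
    "session_host": "192.0.2.10",
}

def info_source(info):
    """Select the manner of obtaining information: retrieve directly when an
    address is known, otherwise fall back to conducting a search."""
    if info.get("info_url"):
        return ("retrieve", info["info_url"])
    return ("search", info.get("search_terms", []))
```

A recipient system can thus branch on whichever addressing information the object record carries.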
- In an exemplary scenario in which the user-selectable object information comprises information to present to a user upon user-selection of a selectable object in a still image, such information may comprise any of a variety of different types of information related to the user-selected object. For example and without limitation, such information may comprise information describing the user-selectable object (e.g., information describing aspects of the object, history of the object, design of the object, source of the object, price of the object, critiques of the object, information provided by commercial enterprises producing and/or providing such object, etc.), information indicating to the user how the user may obtain the selected object, information indicating how the user may utilize the selected object, etc. The information may, for example, comprise information of one or more non-commercial organizations associated with, and/or having information pertaining to, the identified user-selected object (e.g., non-profit and/or government organization contact information, web site address information, etc.).
- In another exemplary scenario, the information corresponding to a user-selectable object in the still image may comprise information related to conducting a search for information corresponding to the user-selectable object. Such information may, for example, comprise network search terms that may be utilized in a search engine to search for information corresponding to the user-selected object. Such information may also comprise information describing the network boundaries of such a search, for example, identifying particular search networks, particular servers, particular addresses, particular databases, etc.
- In an exemplary scenario, the information corresponding to a user-selectable object may describe a manner in which a system is to interact with a user to more clearly identify information desired by the user. For example, such information may comprise information specifying user interaction that should take place when an amount of information available and corresponding to a user-selectable object exceeds a particular threshold. Such user interaction may, for example, help to reduce the amount of information that may ultimately be presented to the user. For example, such information may comprise information describing a user interface comprising providing a list (or menu) of types of information available to the user and soliciting information from the user regarding the selection of one or more of the listed types of information.
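The threshold-driven user interaction described above can be sketched as follows; the threshold value, item layout, and `choose` callback are illustrative assumptions:

```python
def present_info(available_items, threshold=3, choose=None):
    """If the amount of available information exceeds the threshold, present
    a menu of information types and let the user narrow the selection;
    otherwise present everything. `choose` stands in for the real user
    interface and returns the set of types the user picked."""
    if len(available_items) <= threshold:
        return available_items
    types = sorted({item["type"] for item in available_items})
    wanted = choose(types)  # e.g., user selects from a displayed list/menu
    return [item for item in available_items if item["type"] in wanted]

items = [
    {"type": "history", "text": "h1"},
    {"type": "price", "text": "p1"},
    {"type": "history", "text": "h2"},
    {"type": "travel", "text": "t1"},
]
```

When the item count exceeds the threshold, only the types the user selects from the menu are ultimately presented.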
- In yet another exemplary scenario, in which an action associated with a user-selectable object comprises the establishment and/or management of a communication session between the user and one or more networked entities, the user-selectable object information may comprise information describing the manner in which a communication session may be established and/or managed.
- In still another exemplary scenario, in which an action associated with a user-selectable object comprises providing a user interface by which a user may initiate and perform a commercial transaction regarding a user-selectable object, the user-selectable object information may comprise information describing the manner in which the commercial transaction is to be performed (e.g., order forms, financial information exchange, order tracking, etc.).
- As shown above, various user-selectable objects (or types of objects) may, for example, be associated with any of a variety of respective actions that may be taken upon selection of a respective user-selectable object by a user. Such actions (e.g., information retrieval, information searching, communication session management, commercial transaction management, etc.) may, for example, be included in a table or other data structure indexed by the identity of a respective user-selectable object.
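The action table indexed by object identity described above can be sketched as follows; the action names, object identifiers, and fallback behavior are assumptions for illustration:

```python
def retrieve_info(obj_id):
    return "retrieved info for " + obj_id

def run_search(obj_id):
    return "searched for " + obj_id

def open_session(obj_id):
    return "session opened for " + obj_id

# Action table indexed by the identity of a user-selectable object;
# unknown objects fall back to a default action (here, a search).
ACTIONS = {
    "landmark-1": retrieve_info,
    "entertainer-1": open_session,
}

def on_select(obj_id, default=run_search):
    return ACTIONS.get(obj_id, default)(obj_id)
```

The same structure could equally be keyed by object type (person, consumer good, landmark, etc.) rather than by a specific object's identity.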
- Other non-limiting examples of object information corresponding to user-selectable objects in a still image may comprise: athlete information (e.g., statistics, personal information, professional information, history, etc.), entertainer information (e.g., personal information, discography and/or filmography information, information of related organizations, fan club information, photograph and/or video information, etc.), landmark information (e.g., historical information, visitation information, location information, mapping information, photo album information, visitation diary, charitable donation information, etc.), political figure information (e.g., party affiliation, stances on particular issues, history, financial information, voting record, attendance record, etc.), information regarding general types of objects (e.g., information describing actions to take upon user-selection of a person object, of a consumer good object, of a landmark object, etc.) and/or specific objects (e.g., information describing actions to take when a particular person object is selected, when a particular consumer good object is selected, when a particular landmark object is selected, etc.).
- For additional non-limiting examples of actions that may be performed related to user-selectable objects (e.g., in still images as well as in television programming), and related user-selectable object information that may be combined with still image information, the reader is directed to U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”, Attorney Docket No. 21045US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A LOCAL TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”, Attorney Docket No. 21046US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM BASED ON USER LOCATION”, Attorney Docket No. 21047US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”, Attorney Docket No. 21048US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”, Attorney Docket No. 21049US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM UTILIZING AN ALTERNATIVE COMMUNICATION NETWORK”, Attorney Docket No. 21050US02; U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING ADVERTISING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”, Attorney Docket No. 21053US02; U.S. 
patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED PERSON IN A TELEVISION PROGRAM”, Attorney Docket No. 21054US02; and U.S. patent application Ser. No. ______, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED INFORMATION ELEMENT IN A TELEVISION PROGRAM”, Attorney Docket No. 21055US02. The entire contents of each of such applications are hereby incorporated herein by reference in their entirety.
- In general, the above-mentioned types of information corresponding to user-selectable objects in a still image may be general to all eventual viewers (or recipients) of the still image, but may also be customized to a particular target user and/or end user. For example, such information may be customized to a particular user (e.g., based on income level, demographics, age, employment status and/or type, education level and/or type, family characteristics, religion, purchasing history, neighborhood characteristics, home characteristics, health characteristics, etc.). For example, such information may also be customized to a particular geographical location or region.
- In general,
step 220 may comprise receiving object information corresponding to a user-selectable object in the still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of such user-selectable object information or by any particular manner of receiving such user-selectable object information unless explicitly claimed. - The
exemplary method 200 may, at step 230, comprise combining the received still image information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Many non-limiting examples of such combining will now be provided. - As mentioned previously,
step 210 may comprise receiving still image information (e.g., a still image data set) for a still image (e.g., a photographic image) by, at least in part, receiving a completed still image data set for the still image (e.g., formatted in accordance with a still image communication and/or compression standard), where the completed still image data set is formatted for communicating (or storing) the still image without information describing user-selectable objects in the still image. In such an exemplary scenario, step 230 may comprise combining the received still image information and the received user-selectable object information by, at least in part, inserting the received user-selectable object information in the completed still image data set to create a combined data set comprising the received still image data set and the received user-selectable object information. - For example, in an exemplary scenario in which the received completed still image data set, as received, is formatted in accordance with a still image standard (e.g., a JPEG standard),
step 230 may comprise inserting the received user-selectable object information in data fields of the completed still image data set that are not assigned by the still image standard for any specific type of information (e.g., inserting such information into unassigned data fields and/or metadata fields provided by the still image standard, adding new data fields to the still image standard, etc.). - Such inserting may, for example, comprise inserting the received user-selectable object information in data fields of the completed still image data set that are interleaved with data fields carrying still image data. For example, such inserting may be performed in accordance with a format alternating still image data and user-selectable object information (or data) on a pixel-by-pixel basis (e.g., sequencing
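As an illustrative sketch (not the format prescribed by any particular standard), user-selectable object information could be carried in a JPEG application (APPn) segment, which a standard decoder that does not recognize the payload simply skips; the JSON payload and the choice of the APP15 marker are assumptions:

```python
import json
import struct

def embed_object_info(jpeg_bytes, object_info):
    """Insert user-selectable object information as an application (APP15)
    segment immediately after the JPEG SOI marker. APPn segments carry
    application-specific data, so a decoder that does not understand the
    payload skips the segment and renders the image unchanged."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG stream (missing SOI)"
    payload = json.dumps(object_info).encode("utf-8")
    # The two-byte length field counts itself plus the payload.
    segment = b"\xff\xef" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]

# Minimal stand-in for a JPEG stream: SOI marker followed by EOI marker.
combined = embed_object_info(b"\xff\xd8\xff\xd9", {"id": "obj-001"})
```

Because the inserted segment is self-delimiting, the still image data itself is unaffected by the added object information.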
pixel 1 still image data, pixel 1 user-selectable object information, pixel 2 still image data, pixel 2 user-selectable object information, etc.), by groups of pixels (e.g., pixel 1-A still image data, pixel 1-A user-selectable object information, pixel A-N still image data, pixel A-N user-selectable object information, etc.), by lines of pixels, by blocks of pixels, etc. Also for example, utilizing pixel, coordinate or other spatial information, user-selectable object information need not be strictly placed with the still image data for the still image in which the user-selectable object appears. For example, information of user-selectable objects in a still image and/or portion thereof may be communicated before and/or after the image data set for the entire still image is communicated. - Also for example, in another exemplary scenario in which the received completed still image data set (e.g., a picture data set), as received, is formatted in accordance with a still image standard that specifically assigns data fields to information of user-selectable objects,
step 230 may comprise inserting the received user-selectable object information in the data fields of the completed still image data set that are specifically assigned by the still image standard to contain information of user-selectable objects. - Also as mentioned previously,
step 210 may comprise receiving still image information (e.g., a still image data set) for a still image (e.g., a photographic image) by, at least in part, receiving still image information for the still image prior to the still image information being formatted into a completed still image data set for communicating (or storing) the still image. For example, such a scenario may comprise receiving information describing the still image that has yet to be formatted into a data set that conforms to a particular still image standard (e.g., bitmap information, DCT information, etc., which has yet to be placed into a self-contained JPEG data set for communicating and/or storing the still image). In such an exemplary scenario, step 230 may comprise combining the received still image information and the received user-selectable object information into a completed still image data set that is formatted for communicating and/or storing the still image with information describing user-selectable objects in the still image (e.g., into a single cohesive data set, for example, a single data file or other data structure, into a plurality of logically linked data files or other data structures, etc.). - In an exemplary scenario, such a completed still image data set may be formatted in accordance with a still image standard that specifically assigns respective data fields (or elements) to information describing the still image and to user-selectable object information. In another exemplary scenario, such a completed still image data set may be formatted in accordance with a still image standard that specifically assigns data fields to information describing a still image, but does not specifically assign data fields to user-selectable object information (e.g., utilizing general-purpose unassigned data fields, adding new data fields to the standard, etc.).
- Also as mentioned previously,
step 210 may comprise receiving still image information for a still image by, at least in part, receiving an initial combined still image data set that comprises initial still image information and initial user-selectable object information corresponding to user-selectable objects in the still image. For example, prior to being received, the received initial combined still image data set may have already been formed into a single cohesive data set that comprises the still image information for the still image and information of user-selectable objects in the still image. - In such an exemplary scenario, step 230 may comprise modifying the initial user-selectable object information of the initial combined still image data set in accordance with the received user-selectable object information (e.g., as received at step 220). Such modifying may, for example and without limitation, comprise adding the received object information to the initial object information in the initial combined still image data set (e.g., in unused unassigned data fields and/or in unused data fields that have been specifically assigned to contain user-selectable object information, etc.).
- Also such modifying may comprise changing at least a portion of the initial object information of the initial combined still image data set in accordance with the received user-selectable object information (e.g., changing information defining a user-selectable object in a presented still image, changing information about a user-selectable object to be presented to a user, changing information regarding any action that may be performed upon user-selection of a user-selectable object, etc.). Additionally, such modifying may comprise deleting at least a portion of the initial object information in accordance with the received user-selectable object information (e.g., in a scenario in which the received user-selectable object information includes a command or directive to remove a portion or all information corresponding to a particular user-selectable object).
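The add/change/delete modifications described above can be sketched as follows; the combined-set layout and the `{"delete": True}` directive are illustrative assumptions:

```python
import copy

def apply_object_updates(combined_set, updates):
    """Modify the user-selectable object information in an initial combined
    data set: add new object records, change fields of existing records, or
    delete a record when the update carries a delete directive."""
    result = copy.deepcopy(combined_set)   # leave the received set untouched
    objects = result["objects"]
    for obj_id, record in updates.items():
        if record.get("delete"):
            objects.pop(obj_id, None)      # remove all info for this object
        else:
            objects.setdefault(obj_id, {}).update(record)  # add or change
    return result

initial = {"image": "still-42",
           "objects": {"a": {"label": "car"}, "b": {"label": "sign"}}}
updated = apply_object_updates(
    initial,
    {"a": {"label": "sports car"},   # change existing object info
     "b": {"delete": True},          # directive to remove the object
     "c": {"label": "person"}})      # add a new object record
```

The received initial combined data set is left intact; the modified set is produced as a new combined data set.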
- In the previously provided examples of combining the received still image information and the received user-selectable object information, step 230 may comprise performing such operations automatically (i.e., without real-time interaction with a user while such operations are being performed) or with user interaction. For example, the received still image information and the received user-selectable object information may each be uniquely identified to assist in merging such information. For example, step 230 may comprise analyzing such respective unique identifications to determine the still image data set in which the user-selectable object information is to be inserted. For example, the user-selectable object information for a particular user-selectable object may comprise information identifying the specific still image in which the user-selectable object appears. Such information may be utilized at
step 230 to determine the appropriate data set (e.g., still image data file or other bounded data set) in which to place the user-selectable object information. - In another example, step 230 may comprise presenting an operator with a view of the still image and a view of a user-selectable object in such still image for which information is being added to a combined data set. Step 230 may then comprise interacting with the operator to obtain permission and/or directions for combining the still image and user-selectable object information.
- Note that
step 230 may comprise encrypting the user-selectable object information or otherwise restricting access to such information. Such information protection may be beneficial, for example, in a scenario in which access to such information is provided on a subscription basis, in a scenario in which providers of such information desire to protect such information from undesirable access and/or manipulation, etc.
step 230 may comprise combining the received still image information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Accordingly, the scope of various aspects of the present invention should not be limited by any particular manner of performing such combining and/or any particular format in which such a combined data set may be placed unless specifically claimed. - The
exemplary method 200 may, at step 240, comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices. Such communication may comprise characteristics of any of a variety of types of communication, non-limiting examples of which will now be presented. - Step 240 may, for example, comprise communicating the combined data set(s) via a communication network (e.g., a television communication network, a telecommunication network, a general data communication network (e.g., the Internet, a LAN, a PAN, etc.), etc.). Many non-limiting examples of such communication networks were provided previously. Step 240 may, for example, comprise broadcasting, multi-casting and/or uni-casting the combined data set over one or more communication networks. Step 240 may also, for example, comprise communicating the combined data set(s) to another system and/or device via a direct conductive path (e.g., via a wire, circuit board trace, conductive trace on a die, etc.).
- Additionally for example, step 240 may comprise storing the combined data set(s) on a computer readable medium (e.g., a DVD, a CD, a Blu-ray® disc, a laser disc, a magnetic tape, a hard drive, a diskette, etc.). Such a computer readable medium may then, for example, be shipped to a distributor and/or ultimate recipient of the computer readable medium. Further for example, step 240 may comprise storing the combined data set(s) in a volatile and/or non-volatile memory device (e.g., a flash memory device, a one-time-programmable memory device, an EEPROM, a RAM, etc.).
- Further for example, step 240 may comprise storing (or causing or otherwise participating in the storage of) the combined data set(s) in a media system component (e.g., a component or device of the user's local media (or still image presentation) system, a component or device of a media (or still image) provider, and/or a component or device of any still image information source). For example and without limitation, step 240 may comprise storing the combined data set(s), or otherwise participating in the storage of the combined data set(s), in a component of the user's local media system (e.g., in an image presentation device, a digital video recorder, a media receiver, a media player, a media system controller, a personal communication device, a local networked database, a local networked personal computer, etc.).
- Step 240 may, for example, comprise communicating the combined data set in serial fashion. For example, step 240 may comprise communicating the combined data set (comprising interleaved still image information and user-selectable object information) in a single data stream (e.g., via a general data network, via a television or other media network, stored on a hard medium, for example a non-transitory computer-readable medium, in such serial fashion, etc.). Also for example, step 240 may comprise communicating the combined data set in parallel data streams, each of which comprises interleaved still image information and user-selectable object information (e.g., as opposed to separate distinct respective data streams for each of still image information and user-selectable object information).
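The serial, interleaved communication described above can be sketched as follows; the length-prefixed framing and JSON encoding of the object information are illustrative assumptions:

```python
import json

def serialize_interleaved(pixel_blocks, object_blocks):
    """Serialize still image data and user-selectable object information
    into one serial stream, alternating block by block. Each block is
    length-prefixed (4 bytes, big-endian) so a recipient can split the
    stream back into its still image and object information parts."""
    stream = bytearray()
    for pixels, objects in zip(pixel_blocks, object_blocks):
        stream += len(pixels).to_bytes(4, "big") + pixels
        meta = json.dumps(objects).encode("utf-8")
        stream += len(meta).to_bytes(4, "big") + meta
    return bytes(stream)

# Two blocks of image data, each paired with its object information.
stream = serialize_interleaved(
    [b"\x01\x02", b"\x03"],
    [[{"id": "obj-001"}], []],
)
```

The same framing could feed several parallel streams, each itself carrying interleaved still image and object information.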
- In general,
step 240 may comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices (e.g., an end user or associated system, media (or image) provider or associated system, an advertiser or associated system, a media (or image) producer or associated system, a media (or image) database, a media (or image) server, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such communicating or by any particular recipient of such communication unless explicitly claimed. - The
exemplary method 200 may, for example at step 295, comprise performing continued operations. Step 295 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) will be presented below. For example, step 295 may comprise returning execution flow to any of the previously discussed method steps. For example, step 295 may comprise returning execution flow of the exemplary method 200 to step 220 for receiving additional user-selectable object information to combine with still image information. Also for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 210 for receiving additional still image information and user-selectable object information to combine with such received still image information. Additionally for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 240 for additional communication of the combined information to additional recipients.
step 295 may comprise performing continued operations (e.g., performing additional operations corresponding to combining still image information and information of user-selectable objects in such still images, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed. - Turning next to
FIG. 3, such figure is a flow diagram illustrating an exemplary method 300 for providing embedded information of selectable objects in a still image, in accordance with various aspects of the present invention. The exemplary method 300 may, for example, share any or all characteristics with the exemplary method 200 illustrated in FIG. 2 and discussed previously. Any or all aspects of the exemplary method 300 may, for example, be implemented in a media system component (e.g., the media information provider 110, third party image information provider 120, a component of a communication network 130, first media presentation device 140, first media controller 160, second media presentation device 141, media receiver 151, second media controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such media system components operating in conjunction. For example, any or all aspects of the exemplary method 300 may be implemented in one or more media (or image) system components remote from the user's local media system. Also for example, any or all aspects of the exemplary method 300 may be implemented in one or more components of the user's local media (or image) system. - The
exemplary method 300 may, for example, begin executing at step 305. The exemplary method 300 may begin executing in response to any of a variety of causes or conditions. Step 305 may, for example, share any or all characteristics with step 205 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. - The
exemplary method 300 may, for example at step 310, comprise receiving image information for a still image. Step 310 may, for example, share any or all characteristics with step 210 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 310 may comprise receiving any of the various types of still image information from any of the various sources of still image information via any of the various communication media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein. - For example, step 310 may comprise, for example at
sub-step 312, receiving a completed still image data set for the still image, the completed still image data set formatted for communicating and/or storing the still image without information describing user-selectable objects in the still image. Alternatively for example, step 310 may comprise, for example at sub-step 314, receiving still image information for the still image prior to the still image information being formatted into a completed still image data set for communicating and/or storing the still image. Alternatively for example, step 310 may comprise, for example at sub-step 316, receiving a completed still image data set for the still image, the completed still image data set formatted for communicating and/or storing the still image with information describing user-selectable objects in the still image. - The
exemplary method 300 may, for example at step 320, comprise receiving object information corresponding to a user-selectable object in the still image. Step 320 may, for example, share any or all characteristics with step 220 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 320 may comprise receiving any of the various types of user-selectable object information from any of the various sources of user-selectable object information via any of the various types of media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein. - For example, step 320 may comprise, for example at
sub-step 322, receiving user-selectable object information comprising information describing and/or defining the user-selectable object that is shown in the still image (e.g., object dimension information, object movement information, etc.). Also for example, step 320 may comprise, for example at sub-step 324, receiving user-selectable object information comprising information regarding the user-selectable object that may be presented to the user upon user-selection of such object in a still image. - Additionally for example, step 320 may comprise, for example at
sub-step 326, receiving user-selectable object information comprising information describing and/or defining actions that may be taken upon user-selection of a user-selectable object (e.g., retrieving and/or obtaining and/or searching for information about a user-selectable object, information specifying a manner in which a system is to interact with a user regarding a user-selected object, establishing and/or maintaining communication sessions, information describing the manner in which a commercial transaction is to be performed, etc.). - The
exemplary method 300 may, for example at step 330, comprise combining the received still image information (e.g., as received at step 310) and the received user-selectable object information (e.g., as received at step 320) in a combined data set. Step 330 may, for example, share any or all characteristics with step 230 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. - For example, step 330 may comprise, for example at
sub-step 332, inserting the received user-selectable object information in a completed still image data set that was received at step 310 (e.g., inserting such user-selectable object information in fields of the still image data set that are specified by a standard for carrying such user-selectable object information, inserting such user-selectable object information in fields of the still image data set that are not specifically allocated for a particular type of data, etc.). - Also for example, step 330 may comprise, for example at
sub-step 334, combining received still image data and received user-selectable object information into a completed still image data set that is formatted for communicating the still image with information describing user-selectable objects in the still image. Additionally for example, step 330 may comprise, for example at sub-step 336, modifying initial user-selectable object information of an initial combined still image data set in accordance with received user-selectable object information. - The
exemplary method 300 may, for example at step 340, comprise communicating the combined data set(s) (e.g., as formed at step 330) to one or more recipient systems or devices. Step 340 may, for example, share any or all characteristics with step 240 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. - For example, step 340 may comprise, for example at sub-step 342, communicating the combined data set(s) via a communication network (e.g., any of a variety of communication networks discussed herein, etc.). Also for example, step 340 may comprise, for example, at sub-step 344, communicating the combined data set(s) by storing the combined data set(s) on a non-transitory computer readable medium and/or by transmitting the combined data set(s) to another device or system to perform such storage. Additionally for example, step 340 may comprise, for example, at sub-step 346, communicating the combined data set in a single serial stream (e.g., comprising interleaved still image data and user-selectable object information). Further for example, step 340 may comprise, for example, at sub-step 348, communicating the combined data set in a plurality of parallel serial streams (e.g., each of such streams comprising interleaved still image data and user-selectable object information).
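- One way to picture the insertion described at sub-step 332 is the following minimal sketch, assuming a JPEG-style completed still image data set, where a custom application segment (APP15, marker 0xFFEF, chosen here only for illustration) carries the object records as a JSON payload; the function names and record fields are hypothetical, not part of any standard:

```python
# Sketch: embedding user-selectable object information in a completed still
# image data set by inserting a custom APP15 segment right after the SOI
# marker. Readers that skip unknown APP segments still decode the image;
# aware readers can recover the object information.
import json
import struct

def embed_object_info(jpeg_bytes: bytes, objects: list) -> bytes:
    """Return a combined data set with an APP15 segment holding the records."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG data set (missing SOI)"
    payload = json.dumps(objects).encode("utf-8")
    # The segment length field counts itself (2 bytes) plus the payload.
    segment = b"\xff\xef" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]

def extract_object_info(jpeg_bytes: bytes) -> list:
    """Recover the object records from the first APP15 segment, if present."""
    i = jpeg_bytes.find(b"\xff\xef")
    if i < 0:
        return []
    (length,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
    return json.loads(jpeg_bytes[i + 4:i + 2 + length])
```

Because the object records travel inside the image file itself, the combined data set can then be communicated as a single serial stream (per sub-step 346) without any side channel.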
- The
exemplary method 300 may, for example at step 395, comprise performing continued operations. Step 395 may, for example, share any or all characteristics with step 295 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. - As discussed previously with regard to
FIGS. 2 and 3, various aspects of the present invention may comprise incorporating information of user-selectable objects in a still image into a combined data set that comprises such information along with information generally descriptive of the still image, and communication of such a combined data set. As will now be discussed, various aspects of the present invention may comprise forming, communicating and/or storing a user-selectable object data set that is independent of a corresponding still image data set. -
FIG. 4 is a flow diagram illustrating an exemplary method 400 for providing information of selectable objects in a still image in an information stream independent of the still image, in accordance with various aspects of the present invention. Any or all aspects of the exemplary method 400 may, for example, be implemented in a media system component (e.g., the media information provider 110, third party image information provider 120, a component of a communication network 130, first media presentation device 140, first media controller 160, second media presentation device 141, media receiver 151, second media controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such media system components operating in conjunction. For example, any or all aspects of the exemplary method 400 may be implemented in one or more media system components remote from the user's local media system. Also for example, any or all aspects of the exemplary method 400 may be implemented in one or more components of the user's local media system. - The
exemplary method 400 may, for example, begin executing at step 405. The exemplary method 400 may begin executing in response to any of a variety of causes and/or conditions. Step 405 may, for example, share any or all characteristics with steps 205 and 305 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously. - The
exemplary method 400 may, for example at step 410, comprise receiving image information for a still image. Step 410 may, for example, share any or all characteristics with steps 210 and 310 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously. - The
exemplary method 400 may, for example at step 420, comprise determining user-selectable object information corresponding to one or more user-selectable objects in a still image. Step 420 may, for example, share any or all characteristics with steps 220 and 320 of the exemplary methods 200 and 300 illustrated in FIGS. 2 and 3 and discussed previously. - For example, step 420 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which were provided previously (e.g., in the discussion of
step 220 and elsewhere herein). The object information corresponding to one or more user-selectable objects that is determined at step 420 (e.g., developed by and received from a local process and/or received from an external source) may comprise any of a variety of characteristics; numerous examples of such object information were provided previously (e.g., in the discussion of step 220 and elsewhere herein). - The
exemplary method 400 may, at step 430, comprise forming a user-selectable object data set comprising the determined user-selectable object information (e.g., as determined at step 420), where the user-selectable object data set is independent of a still image data set (e.g., as received at step 410) generally representative of the still image. Step 430 may comprise performing such data set formation in any of a variety of manners, non-limiting examples of which will now be presented. - For example, step 430 may comprise forming the user-selectable object data set (e.g., a data file or other data structure, a logical grouping of data, etc.) in a manner that is spatially synchronized with a still image (or a still image data set representative of a still image).
- For example, in an exemplary scenario in which a still image data set is parsed into blocks (e.g., groups of pixels),
step 430 may comprise forming the user-selectable object data set by, at least in part, parsing the user-selectable object information in a manner that logically mirrors the still image data set blocks. For example, in a scenario where a user-selectable object appears in block N of a still image, the user-selectable object information describing the user-selectable object may be placed in a corresponding block (e.g., Nth block, data segment, etc.) of the user-selectable object data set. In such a scenario, the user-selectable object data set might include null (or no) information in blocks corresponding to still image blocks that do not include any user-selectable objects. For example, the user-selectable object data set need not include information for block P if corresponding block P of the still image does not include any user-selectable objects. - As another example, in an exemplary scenario in which a still image data set is parsed into blocks, step 430 may comprise forming the user-selectable object data set by, at least in part, including information indicating the blocks of the still image in which the user-selectable object appears (e.g., along with the dimensions of the user-selectable object and/or other spatially descriptive information). For example, in an exemplary scenario in which a user-selectable object appears in blocks A-B of a still image, step 430 may comprise incorporating information into the user-selectable object data set that indicates the user-selectable object appears in blocks A-B of the still image, along with information describing the dimensions and/or locations of the user-selectable object in such blocks of the still image.
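- The block-mirroring approach above can be sketched as follows; this is a minimal illustration only, and the function name and record fields (e.g., "block", "bbox") are hypothetical:

```python
# Sketch: forming a user-selectable object data set that logically mirrors
# the still image data set's blocks. An object appearing in block N of the
# image lands in slot N of the object data set; blocks with no user-selectable
# objects hold None (the "null information" case described above).

def form_block_mirrored_data_set(num_blocks: int, objects: list) -> list:
    """objects: records like {"id": ..., "block": ..., "bbox": ...}."""
    slots = [None] * num_blocks
    for obj in objects:
        slots[obj["block"]] = obj  # slot index mirrors the image block index
    return slots
```

The block-range variant described above would instead store, with each record, the range of image blocks (e.g., blocks A-B) that the object spans, along with its dimensions within those blocks.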
- Note that in an exemplary scenario in which the user-selectable object data set includes information that spatially synchronizes the user-selectable object data set to the still image data set, not all information of the user-selectable object data set need be so synchronized. For example, information corresponding to user-selectable objects that is not spatially-specific may be included in the user-selectable object data set in an unsynchronized (or asynchronous) manner. In an exemplary scenario, information describing user-selectable objects (or selectable regions thereof) as such user-selectable objects appear in a presented still image may be spatially-synchronized (e.g., block-synchronized) to the still image data set, while information to be presented to the user upon user-selection of such user-selectable objects and/or information describing any action to take upon user-selection of such user-selectable objects may be included in the user-selectable object data set in an unsynchronized manner (e.g., in a data structure (or sub-data structure) that is indexed by object identity to retrieve such information).
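- The split just described, spatially synchronized region data alongside unsynchronized per-object data, can be sketched as below; the structure keys and function names are illustrative assumptions, not a prescribed format:

```python
# Sketch: a hybrid user-selectable object data set. The "block_map" portion is
# spatially synchronized to the still image (block index -> object identity),
# while the "objects" portion is an unsynchronized table indexed by object
# identity, holding presentation and/or action information.

def form_hybrid_data_set(block_map: dict, object_table: dict) -> dict:
    return {
        "block_map": block_map,   # spatially synchronized to the image blocks
        "objects": object_table,  # indexed by object identity, unsynchronized
    }

def lookup_selection(data_set: dict, block: int):
    """Resolve a user selection at an image block to the object's info."""
    obj_id = data_set["block_map"].get(block)
    return data_set["objects"].get(obj_id) if obj_id is not None else None
```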
- Though the above examples were directed to spatially-based synchronization of the user-selectable object data set to the still image (e.g., a corresponding still image data set), other synchronization information may also be utilized. For example, in an exemplary scenario in which a still image is presented for a particular time window, the user-selectable object data set may comprise time synchronization information indicating that such user-selectable object data set corresponds to the particular time window. Also for example, step 430 may comprise incorporating data markers into the user-selectable object data set that correspond to respective markers in a still image data set. Additionally for example, step 430 may comprise incorporating data pointers into the user-selectable object data set that point to respective absolute and/or relative locations within a still image data set.
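- The data-pointer option mentioned above can be pictured with the following sketch, in which each object record is given an absolute byte offset into the still image data set; the field names and offsets are purely illustrative:

```python
# Sketch: attaching data pointers to object records. Each record gains an
# "image_offset" pointing at the absolute location of its image block within
# the still image data set, as an alternative to block mirroring.

def add_data_pointers(objects: list, block_offsets: list) -> list:
    """Return new records carrying the offset of each record's image block."""
    return [dict(obj, image_offset=block_offsets[obj["block"]])
            for obj in objects]
```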
- The above examples generally apply to information describing the presence of user-selectable objects in the still image. As discussed previously, the user-selectable object information may also comprise information to be provided to the user upon selection of a user-selectable object, information describing communication sessions and/or other actions that may be performed upon selection of the user-selectable object, etc. Note that in particular exemplary scenarios, such information may be incorporated into the user-selectable object data set at
step 430. For example, step 430 may comprise incorporating such user-selectable object information into the user-selectable object data set in a manner that provides for indexing such information by object identity. For example, such information need only be incorporated into the user-selectable object data set one time (e.g., positioned in the user-selectable object data set such that a recipient of the user-selectable object data set will have received such information prior to user selection of the user-selectable object corresponding to such information). For example, in an exemplary scenario involving a user-selectable consumer good in a still image, step 430 may comprise forming the user-selectable object data set such that, when communicated to a user's local media (or image) presentation system, information of actions to perform upon user selection of the consumer good in the still image will have been received by the user's local media (or image) presentation system prior to the user's first opportunity to select the consumer good in the still image. - As discussed above, the user-selectable object data set formed at
step 430 may comprise characteristics of different types of data sets (or structures). For example, step 430 may comprise forming a data file that comprises the user-selectable object information. Such a user-selectable object data file may, for example, comprise metadata that correlates the user-selectable object data file to one or more corresponding still image data files that are utilized to communicate the general still image (e.g., without user-selectable object information). - Step 430 may also, for example, comprise forming an array of the user-selectable object information. Such an array may, for example, comprise records associated with respective user-selectable objects in a still image and may be indexed and/or sorted by object identification. Similarly, step 430 may comprise forming a linked list of respective data records corresponding to user-selectable objects in the still image. Such a linked list may, for example, be a multi-dimensional linked list with user-selectable objects in a first dimension and respective records associated with different types of information associated with a particular user-selectable object in a second dimension.
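- A minimal sketch of the data-file flavor just described follows: an independent object data set whose records are sorted by object identification and whose metadata correlates it to the still image data file(s) it annotates. The key names ("corresponds_to", "records") are hypothetical:

```python
# Sketch: an independent user-selectable object data file. The correlation
# metadata names the still image data file(s) communicated separately; the
# records are sorted by object identification for indexed lookup.

def form_object_data_file(image_file: str, records: list) -> dict:
    return {
        "corresponds_to": [image_file],  # correlation metadata
        "records": sorted(records, key=lambda r: r["object_id"]),
    }
```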
- As mentioned above, the user-selectable object data set may be independent of one or more still image data sets generally representative of the still image. Such an implementation advantageously provides for independent formation and maintenance of the user-selectable object data set that corresponds to the still image. For example, in such an implementation, a data set (e.g., a still image data file, JPEG file, etc.) for a still image may be developed (e.g., by an image studio) for communication of the still image to all users, while a data set for user-selectable objects in the still image may be developed (e.g., by an advertising company, by a sponsor, by a network operator, by one or more components of a user's local media system, etc.) independently. In such a scenario, the user-selectable object data set may be developed and/or changes may be made to the user-selectable object data set without impacting the still image data set. Also, in such a scenario, as mentioned above, user-selectable object information may be customized to a user or group of users. In such a scenario, a plurality of different user-selectable object data sets may be developed that each correspond to the same still image data set. For example, step 430 may comprise forming a first user-selectable object data set for a New York audience or recipient of a still image, and forming a second user-selectable object data set for a Los Angeles audience or recipient of the still image without necessitating modification of the still image data set, which communicates the still image in the same manner to each of the New York and Los Angeles audiences or recipients.
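- The audience-customization scenario above can be sketched as two independent object data sets referencing the same unmodified still image data set; the audience labels and field names are illustrative only:

```python
# Sketch: per-audience user-selectable object data sets. The still image data
# set ("ad.jpg" here, a hypothetical file name) is never modified; only the
# independent object data sets differ between audiences.

def form_audience_data_sets(image_file: str, per_audience: dict) -> dict:
    return {
        audience: {"corresponds_to": image_file, "objects": objs}
        for audience, objs in per_audience.items()
    }
```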
- In general,
step 430 may comprise forming a user-selectable object data set comprising the determined user-selectable object information (e.g., as determined at step 420), where the user-selectable object data set is independent of a still image data set generally representative of the still image. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular types of user-selectable object data, characteristics of particular types of user-selectable object data sets, and/or characteristics of any particular manner of forming user-selectable object data sets unless explicitly claimed. - The
exemplary method 400 may, at step 440, comprise communicating the formed user-selectable object data set (e.g., as formed at step 430) to one or more recipients. Step 440 may comprise performing such communicating in any of a variety of manners, non-limiting examples of which will now be provided. - For example, step 440 may comprise communicating the user-selectable object data set in one or more data streams (which may be called “user-selectable object data streams” herein) independent of one or more still image data streams that generally communicate the still image (i.e., that generally communicate the still image data set). Note that, while such a still image data set generally need not comprise information of user-selectable objects therein, such information may be present. For example, the user-selectable object data set may comprise information of user-selectable objects in the still image that supplements (e.g., appends and/or amends) information of user-selectable objects that might be present in the still image data set.
- Step 440 may, for example, comprise communicating the user-selectable object data set time-synchronized to communication of the still image data set. For example, even in a scenario in which the user-selectable object data set is independent of the general still image data set,
step 440 may still time-synchronize communication of the user-selectable object data set with communication of the general still image data set. - For example, in such an exemplary scenario, step 440 may comprise communicating the user-selectable object data set concurrently (e.g., simultaneously and/or pseudo-simultaneously in a time-sharing manner) with communication of the still image data set that generally communicates the still image. For example, such concurrent communication may comprise communicating at least a portion of the user-selectable object data set and at least a portion of the still image data set in a time-multiplexed manner (e.g., via a shared communication channel (e.g., a frequency channel, a code channel, a time/frequency channel, etc.)). Also for example, such concurrent communication may comprise communicating the user-selectable object data set in parallel with communication of the still image data set (e.g., on separate respective sets of one or more parallel communication channels).
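- The time-multiplexed option above can be sketched as follows; the chunk size and the "IMG"/"OBJ" tags are illustrative framing choices, not part of any protocol:

```python
# Sketch: interleaving chunks of the still image data set and the
# user-selectable object data set into one serial stream over a shared
# channel. Each chunk is tagged with its source so the receiver can
# demultiplex the two data sets.

def interleave_streams(image_bytes: bytes, object_bytes: bytes, chunk: int = 4):
    stream = []
    for i in range(0, max(len(image_bytes), len(object_bytes)), chunk):
        if i < len(image_bytes):
            stream.append(("IMG", image_bytes[i:i + chunk]))
        if i < len(object_bytes):
            stream.append(("OBJ", object_bytes[i:i + chunk]))
    return stream

def demultiplex(stream):
    img = b"".join(data for tag, data in stream if tag == "IMG")
    obj = b"".join(data for tag, data in stream if tag == "OBJ")
    return img, obj
```

The parallel-channel option would instead send the two data sets on separate channels, with no interleaving needed.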
- Also for example, step 440 may comprise communicating the user-selectable object data set via at least one communication channel that is different from one or more communication channels over which the still image data set is communicated. For example, even in a scenario in which the user-selectable object data set and the still image data set are communicated over at least one shared communication channel,
step 440 may comprise communicating the user-selectable object data set in at least one communication channel that is different from the communication channel(s) over which the still image data set is communicated. - Step 440 may, for example, comprise communicating the user-selectable object data set over a first communication network that is different from a second communication network over which the still image data set is communicated. As a non-limiting example, step 440 may comprise communicating the user-selectable object data set over a first communication network (e.g., a first general data communication network), where the still image data set is communicated over a second communication network (e.g., a second general data communication network).
- Step 440 may, for example, comprise communicating the user-selectable object data set over a first type of communication network that is different from a second type of communication network over which the still image data set is communicated. As a non-limiting example, step 440 may comprise communicating the user-selectable object data set over a first general data communication network, where the still image data set is communicated over a television communication network (e.g., a cable television network, a satellite television network, etc.).
- Also for example, step 440 may comprise communicating the user-selectable object data set utilizing a first communication protocol that is different from a second communication protocol that is utilized to communicate the still image data set. For example, step 440 may comprise communicating the user-selectable data set utilizing TCP/IP, while the general still image data set is communicated utilizing a cable television protocol.
- Also for example, step 440 may comprise communicating the user-selectable object data set to a first set of one or more user local media (or image) presentation systems, where the first set is a subset of a second set of user local media (or image) presentation systems to which the still image data set is communicated. For example, step 440 may comprise multicasting the user-selectable object data set to a multicast group, where the still image data set is broadcast to a superset of the multicast group. Also for example, step 440 may comprise unicasting the user-selectable object data set to a single user local media (or image) presentation system, where the still image data set is broadcast or multicast to a superset of the single user.
- Additionally for example, step 440 may comprise communicating the user-selectable object data set to a first set of one or more components of a user's local media (or image) presentation system, where at least a portion of such first set is different from a second set of one or more components of the user's local media (or image) presentation system to which the still image data set is communicated. For example, in a non-limiting exemplary scenario in which the still image data set is being communicated to a media receiver and a media controller of a user's local media system,
step 440 may comprise communicating the user-selectable object data set to the media controller and not to the media receiver. - Step 440 may comprise communicating the user-selectable object data set with or without regard for the timing of the communication of the still image (e.g., the still image data set) to which the user-selectable object data set corresponds. For example, step 440 may comprise communicating the user-selectable object data set whenever the still image data set is communicated. Also for example, step 440 may comprise communicating the entire user-selectable object data set before the still image data set is communicated. In such a scenario, the recipient of the communicated user-selectable object data set may be assured of having received such data set prior to receipt of the still image to which the user-selectable object data set corresponds.
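- The ordering guarantee just described, where the recipient holds the complete object data set before the still image arrives, can be sketched as below; the message framing and function names are hypothetical:

```python
# Sketch: scheduling transmission so the entire user-selectable object data
# set is emitted before the still image data set, assuring the recipient has
# all object information prior to the first opportunity to select an object.

def schedule_transmission(object_data_set: bytes, image_data_set: bytes) -> list:
    return [("OBJECT_SET", object_data_set), ("IMAGE_SET", image_data_set)]

def object_info_precedes_image(messages: list) -> bool:
    """Check the ordering property on a received message sequence."""
    kinds = [kind for kind, _ in messages]
    return kinds.index("OBJECT_SET") < kinds.index("IMAGE_SET")
```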
- Though the previous examples generally
concerned step 440 communicating the user-selectable object data set via a communication network to one or more destination systems, step 440 may also comprise communicating the user-selectable object data set to a storage device where the user-selectable object data set is stored in a storage medium, for example an optical storage medium (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage medium (e.g., hard disc, diskette, magnetic tape, etc.), computer memory device (e.g., non-transitory computer-readable medium, flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.), etc. Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 400. - In such a scenario, step 440 may comprise communicating the user-selectable object data set to a storage device where the user-selectable object data set is stored in the same storage medium as a medium on which the still image data set is stored. For example, the user-selectable object data set may be stored in one or more data structures that are independent of one or more data structures in which the still image data set is stored (e.g., stored in one or more separate data files).
- Also, in such a scenario, step 440 may comprise communicating the user-selectable object data set to one or more devices of the user's local media system (e.g., a media receiver, a digital video recorder, a media presentation device, a media controller, a personal computer, etc.) and/or one or more devices of a media source system and/or one or more devices of a media distribution system for storage in such device(s).
- In general,
step 440 may comprise communicating the formed user-selectable object data set (e.g., as formed at step 430) to one or more recipients (e.g., an end user or associated system, still image provider or associated system, an advertiser or associated system, a still image producer or associated system, a still image database, a still image server, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such communicating or by any particular recipient of such communication unless explicitly claimed. - The
exemplary method 400 may, for example at step 495, comprise performing continued operations. Step 495 may comprise performing any of a variety of continued operations; non-limiting examples of such continued operation(s) will be presented below. For example, step 495 may comprise returning execution flow to any of the previously discussed method steps. For example, step 495 may comprise returning execution flow of the exemplary method 400 to step 420 for receiving additional user-selectable object information to form into an independent user-selectable object data set and communicate. Additionally for example, step 495 may comprise returning execution flow of the exemplary method 400 to step 440 for additional communication of the user-selectable object data set (e.g., to additional recipients). - In general,
step 495 may comprise performing continued operations (e.g., performing additional operations corresponding to forming and/or communicating user-selectable object data sets related to user-selectable objects in a still image, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed. - Turning next to
FIG. 5, such figure is a diagram illustrating an exemplary media system 500 (e.g., a single media system component and/or plurality of media system components), in accordance with various aspects of the present invention. The exemplary media system 500 may, for example, share any or all characteristics with one or more of the media system components illustrated in FIG. 1 and discussed previously. For example, the exemplary media system 500 may correspond to any of the media system components illustrated in FIG. 1 (or the like) or any group of the media system components illustrated in FIG. 1 (or the like). Also, the exemplary media system 500 may comprise characteristics of a computing system (e.g., a personal computer, a mainframe computer, a digital signal processor, etc.). The exemplary media system 500 (e.g., various modules thereof) may operate to perform any or all of the functionality discussed previously with regard to the exemplary methods 200, 300 and 400 illustrated in FIGS. 2-4 and discussed previously. - The
exemplary media system 500 includes a first communication interface module 510. The first communication interface module 510 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 510 is illustrated coupled to a wireless RF antenna via a wireless port 512, the wireless medium is merely illustrative and non-limiting. The first communication interface module 510 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which still image-related information (e.g., still image information, information of user-selectable objects in a still image, still image information with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the first communication interface module 510 may operate to communicate with local sources of still image-related content or other data (e.g., disc drives, computer-readable medium readers, video or image recorders, cameras, computers, receivers, personal electronic devices, cellular telephones, personal digital assistants, personal media players, etc.). Additionally, for example, the first communication interface module 510 may operate to communicate with a remote controller (e.g., directly or via one or more intermediate communication networks). - The
exemplary media system 500 includes a second communication interface module 520. The second communication interface module 520 may, for example, operate to communicate over any of a variety of communication media, utilizing any of a variety of communication protocols. For example, the second communication interface module 520 may communicate via a wireless RF communication port 522 and antenna, or may communicate via a non-tethered optical communication port 524 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 520 may communicate via a tethered optical communication port 526 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 528 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 520 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which still image-related information (e.g., still image information, information of user-selectable objects in a still image, image information with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the second communication module 520 may operate to communicate with local sources of still image-related information (e.g., disc drives, computer-readable medium readers, video or image recorders, cameras, computers, receivers, personal electronic devices, cellular telephones, personal digital assistants, personal media players, etc.). 
Additionally, for example, the second communication module 520 may operate to communicate with a remote controller (e.g., directly or via one or more intervening communication networks). - The
exemplary media system 500 may also comprise additional communication interface modules, which are not illustrated in FIG. 5 (though some may be shown in FIG. 6). Such additional communication interface modules may, for example, share any or all aspects with the first 510 and second 520 communication interface modules discussed above. - The
exemplary media system 500 may also comprise a communication module 530. The communication module 530 may, for example, operate to control and/or coordinate operation of the first communication interface module 510 and the second communication interface module 520 (and/or additional communication interface modules as needed). The communication module 530 may, for example, provide a convenient communication interface by which other components of the media system 500 may utilize the first 510 and second 520 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules share a medium and/or network, the communication module 530 may coordinate communications to reduce collisions and/or other interference between the communication interface modules. - The
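coordination described above can be sketched in a brief, hypothetical example. Nothing below comes from the patent itself: the `CommunicationModule` class, its `send()` protocol, and the use of a lock to serialize access to a shared medium are illustrative assumptions about how a communication module 530 might coordinate multiple interface modules to reduce collisions.

```python
import threading

class CommunicationModule:
    """Hypothetical coordinator for several communication interface modules
    (e.g., modules 510 and 520) that share a medium and/or network."""

    def __init__(self, interfaces):
        # interfaces: mapping of name -> interface object exposing send(payload)
        self.interfaces = interfaces
        # Allow one transmission at a time on the shared medium,
        # reducing collisions between interface modules.
        self._medium_lock = threading.Lock()

    def send(self, interface_name, payload):
        """Route payload through the named interface, serializing access."""
        iface = self.interfaces[interface_name]
        with self._medium_lock:
            return iface.send(payload)
```

Other components of the media system would call `send()` rather than driving the interface modules directly, which is one plausible reading of the "convenient communication interface" described above. - The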
exemplary media system 500 may additionally comprise one or more user interface modules 540. The user interface module 540 may generally operate to provide user interface functionality to a user of the media system 500. For example, and without limitation, the user interface module 540 may operate to provide for user control of any or all standard media system commands (e.g., channel control, volume control, on/off, screen settings, input selection, etc.). The user interface module 540 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the media system 500 (e.g., buttons, etc.) and may also utilize the communication module 530 (and/or the first 510 and second 520 communication interface modules) to communicate with other systems and/or components thereof (e.g., a media system controller, such as a dedicated media system remote control, a universal remote control, a cellular telephone, a personal computing device, a gaming controller, etc.) regarding still image-related information, regarding user interaction that occurs during the formation of combined data set(s), etc. In various exemplary scenarios, the user interface module(s) 540 may operate to utilize the optional display 570 to communicate with a user regarding user-selectable object information and/or to present still image information to a user. - The
user interface module 540 may also comprise one or more sensor modules that operate to interface with and/or control operation of any of a variety of sensors that may be utilized during the formation of the combined data set(s). For example, the one or more sensor modules may be utilized to ascertain an on-screen pointing location, which may, for example, be utilized to input and/or receive user-selectable object information (e.g., to indicate and/or define user-selectable objects in a still image). For example and without limitation, the user interface module 540 (or sensor module(s) thereof) may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices, via the communication interface modules 510 and 520, etc.). Additionally, the user interface module 540 may perform any of a variety of still image output functions (e.g., presenting still image information to a user, presenting user-selectable object information to a user, providing visual feedback to a user regarding an identified user-selected object in a presented still image, etc.). - The
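hit-testing of an on-screen pointing location against user-selectable object regions can be sketched as follows. This is an illustrative assumption rather than the patent's implementation: the `SelectableObject` type, its rectangular regions, and the `object_at()` helper are all hypothetical names introduced here.

```python
from dataclasses import dataclass

@dataclass
class SelectableObject:
    """Hypothetical user-selectable object with an axis-aligned
    bounding region in screen coordinates (x, y, width, height)."""
    name: str
    x: int
    y: int
    w: int
    h: int

def object_at(objects, px, py):
    """Return the first user-selectable object whose region contains
    the pointing location (px, py), or None if no object is hit."""
    for obj in objects:
        if obj.x <= px < obj.x + obj.w and obj.y <= py < obj.y + obj.h:
            return obj
    return None

objects = [SelectableObject("car", 100, 200, 150, 80),
           SelectableObject("sign", 400, 50, 60, 90)]
hit = object_at(objects, 160, 240)  # pointing location inside the "car" region
```

A real sensor module would feed `object_at()` with a pointing location derived from sensor signals; the result could then drive the visual-feedback functions described above. - The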
exemplary media system 500 may comprise one or more processors 550. The processor 550 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, the processor 550 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 5, such illustrative modules, or a portion thereof, may be implemented by the processor 550. - The
exemplary media system 500 may comprise one or more memories 560. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 560. Such memory 560 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 560 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable (OTP) memory, etc.), hard drive memory, CD memory, DVD memory, etc. - The
exemplary media system 500 may comprise one or more modules 552 (e.g., still image information receiving module(s)) that operate to receive still image information for a still image. Such one or more modules 552 may, for example, operate to utilize the communication module 530 (e.g., and at least one of the communication interface modules 510, 520) and/or the user interface module(s) 540 to receive such still image information. For example, such one or more modules 552 may operate to perform step 210 of the exemplary method 200 discussed previously and/or step 310 of the exemplary method 300 discussed previously. - The
exemplary media system 500 may comprise one or more module(s) 554 (e.g., user-selectable object information receiving module(s)) that operate to receive object information corresponding to one or more user-selectable objects in a still image. Such one or more modules 554 may, for example, operate to utilize the communication module 530 (e.g., and at least one of the communication interface modules 510, 520) and/or the user interface module(s) 540 to receive such still image user-selectable object information. For example, such one or more modules 554 may operate to perform step 220 of the exemplary method 200 discussed previously and/or step 320 of the exemplary method 300 discussed previously. - The
exemplary media system 500 may comprise one or more modules 556 (e.g., still image and user-selectable object information combining module(s)) that operate to combine received still image information (e.g., as received by the module(s) 552) and received user-selectable object information (e.g., as received by the module(s) 554) into a combined data set. Such one or more modules 556 may, for example, operate to receive still image information from the module(s) 552, receive user-selectable object information from the module(s) 554, combine such received still image information and user-selectable object information into a combined data set, and output such combined data set. Such one or more modules 556 may operate to perform step 230 of the exemplary method 200 discussed previously and/or step 330 of the exemplary method 300 discussed previously. - The
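combining step above can be sketched, under stated assumptions, as a simple container format. The layout below (a length-prefixed JSON header of object metadata followed by the raw image bytes) is purely illustrative; the patent does not specify a particular encoding for the combined data set.

```python
import json

def combine(still_image_bytes, object_info):
    """Form a combined data set: still image information plus
    user-selectable object information packaged together.
    Layout (assumed): 4-byte big-endian header length, JSON header, image bytes."""
    header = json.dumps({"objects": object_info}).encode("utf-8")
    return len(header).to_bytes(4, "big") + header + still_image_bytes

def split(combined):
    """Recover the object information and image bytes from a combined data set."""
    n = int.from_bytes(combined[:4], "big")
    header = json.loads(combined[4:4 + n].decode("utf-8"))
    return header["objects"], combined[4 + n:]
```

A recipient system would use something like `split()` to recover both the still image and its user-selectable object information from a single received data set. - The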
exemplary media system 500 may comprise one or more modules 558 (e.g., combined data set communication module(s)) that operate to communicate the combined data set to at least one recipient system and/or device. For example, such module(s) 558 may operate to utilize the communication module(s) 530 (and, for example, one or both of the first communication interface module(s) 510 and second communication interface module(s) 520) to communicate the combined data set. Also for example, such module(s) 558 may operate to communicate the combined data set to one or more system devices that store the combined data set on a physical medium (e.g., a non-transitory computer-readable medium). Such one or more modules 558 may operate to perform step 240 of the exemplary method 200 discussed previously and/or step 340 of the exemplary method 300 discussed previously. - Though not illustrated, the
exemplary media system 500 may, for example, comprise one or more modules that operate to perform any or all of the processing discussed previously with regard to the exemplary method 400. Such modules (e.g., as with the one or more modules 552, 554, 556 and 558 discussed previously) may, for example, be implemented by the processor 550 executing instructions stored in the memory 560. Such module(s) may, for example, comprise one or more image receiving module(s) that operate to perform the still image receiving functionality discussed previously with regard to step 410. Such module(s) may also, for example, comprise one or more user-selectable object information determining module(s) that operate to perform the information determining functionality discussed previously with regard to step 420. Such module(s) may additionally, for example, comprise one or more user-selectable object data set formation module(s) that operate to perform the data set formation functionality discussed previously with regard to step 430. Such module(s) may further, for example, comprise one or more user-selectable object data set communication module(s) that operate to perform the communication functionality discussed previously with regard to step 440. - Also, though not illustrated, the
exemplary media system 500 may, for example, comprise one or more modules that operate to perform any or all of the continued processing discussed previously with regard to step 295 of the exemplary method 200, step 395 of the exemplary method 300, and step 495 of the exemplary method 400. Such modules (e.g., as with the one or more modules discussed previously) may, for example, be implemented by the processor 550 executing instructions stored in the memory 560. - Turning next to
FIG. 6, such figure is a diagram illustrating exemplary modules and/or sub-modules for a media system 600, in accordance with various aspects of the present invention. The exemplary media system 600 may share any or all aspects with the media system 500 illustrated in FIG. 5 and discussed previously. For example, the exemplary media system 600 may share any or all characteristics with one or more of the media system components illustrated in FIG. 1 and discussed previously. For example, the exemplary media system 600 may correspond to any of the media system components illustrated in FIG. 1 (or the like) or any group of the media system components illustrated in FIG. 1 (or the like). For example, the exemplary media system 600 (or various modules thereof) may operate to perform any or all functionality discussed herein with regard to the exemplary method 200 illustrated in FIG. 2, the exemplary method 300 illustrated in FIG. 3, and the exemplary method 400 illustrated in FIG. 4. - For example, the
media system 600 comprises a processor 630. Such a processor 630 may, for example, share any or all characteristics with the processor 550 discussed with regard to FIG. 5. Also for example, the media system 600 comprises a memory 640. Such memory 640 may, for example, share any or all characteristics with the memory 560 discussed with regard to FIG. 5. - Also for example, the
media system 600 may comprise any of a variety of user interface module(s) 650. Such user interface module(s) 650 may, for example, share any or all characteristics with the user interface module(s) 540 discussed previously with regard to FIG. 5. For example and without limitation, the user interface module(s) 650 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen), a vibrating mechanism, a keypad, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, a light pen, a game controlling device, etc.). - The
exemplary media system 600 may also, for example, comprise any of a variety of communication modules (605, 606, and 610). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 510, 520 discussed previously with regard to FIG. 5. For example and without limitation, the communication interface module(s) 610 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary media system 600 is also illustrated as comprising various wired 606 and/or wireless 605 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby. - The
exemplary media system 600 may also comprise any of a variety of signal processing module(s) 690. Such signal processing module(s) 690 may share any or all characteristics with modules of the exemplary media system 500 that perform signal processing. Such signal processing module(s) 690 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 690 may comprise: video/graphics processing modules (e.g., MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.). - In summary, various aspects of the present invention provide a system and method for providing information of selectable objects in a still image and/or data stream. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (31)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/881,031 US20110066929A1 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a still image file and/or data stream |
US14/488,778 US20150007222A1 (en) | 2009-09-14 | 2014-09-17 | System And Method For Providing Information Of Selectable Objects In A Television Program In An Information Stream Independent Of The Television Program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US24223409P | 2009-09-14 | 2009-09-14 | |
US12/881,031 US20110066929A1 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a still image file and/or data stream |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110066929A1 true US20110066929A1 (en) | 2011-03-17 |
Family
ID=43730008
Family Applications (34)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/774,221 Abandoned US20110063522A1 (en) | 2009-09-14 | 2010-05-05 | System and method for generating television screen pointing information using an external receiver |
US12/774,154 Active 2031-11-12 US9110517B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television |
US12/774,380 Active 2031-05-27 US8990854B2 (en) | 2009-09-14 | 2010-05-05 | System and method in a television for providing user-selection of objects in a television program |
US12/774,321 Active 2031-09-04 US8947350B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television control device |
US12/850,911 Active 2030-12-04 US9197941B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/851,036 Expired - Fee Related US9462345B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television system for providing for user-selection of an object in a television program |
US12/850,945 Active 2031-08-28 US9081422B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/850,866 Active 2031-02-23 US9098128B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television receiver for providing user-selection of objects in a television program |
US12/851,075 Abandoned US20110067069A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a parallel television system for providing for user-selection of an object in a television program |
US12/850,832 Abandoned US20110067047A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a distributed system for providing user-selection of objects in a television program |
US12/880,888 Active 2030-10-07 US8819732B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing information associated with a user-selected person in a television program |
US12/881,110 Active US9137577B2 (en) | 2009-09-14 | 2010-09-13 | System and method of a television for providing information associated with a user-selected information element in a television program |
US12/881,031 Abandoned US20110066929A1 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a still image file and/or data stream |
US12/880,668 Active 2031-09-11 US8832747B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program based on user location |
US12/881,067 Active 2030-10-03 US9043833B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/881,004 Active 2031-08-19 US8931015B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program in an information stream independent of the television program |
US12/880,594 Active 2030-11-25 US8839307B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a local television system for responding to user-selection of an object in a television program |
US12/881,096 Active US9258617B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/880,749 Active 2030-09-26 US9110518B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program utilizing an alternative communication network |
US12/880,530 Abandoned US20110067054A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a distributed system for responding to user-selection of an object in a television program |
US12/880,851 Abandoned US20110067051A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing advertising information associated with a user-selected object in a television program |
US12/880,965 Active US9271044B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program |
US14/457,451 Abandoned US20150012939A1 (en) | 2009-09-14 | 2014-08-12 | System And Method In A Television System For Providing Advertising Information Associated With A User-Selected Object In A Television Program |
US14/467,408 Abandoned US20140366062A1 (en) | 2009-09-14 | 2014-08-25 | System And Method In A Television System For Providing Information Associated With A User-Selected Person In A Television Program |
US14/479,670 Abandoned US20140380381A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Based On User Location |
US14/480,020 Abandoned US20140380401A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Local Television System For Responding To User-Selection Of An Object In A Television Program |
US14/488,778 Abandoned US20150007222A1 (en) | 2009-09-14 | 2014-09-17 | System And Method For Providing Information Of Selectable Objects In A Television Program In An Information Stream Independent Of The Television Program |
US14/572,916 Abandoned US20150106857A1 (en) | 2009-09-14 | 2014-12-17 | System And Method For Generating Screen Pointing Information In A Television Control Device |
US14/603,457 Abandoned US20150135217A1 (en) | 2009-09-14 | 2015-01-23 | System And Method In A Television For Providing User-Selection Of Objects In A Television Program |
US14/625,810 Abandoned US20150172769A1 (en) | 2009-09-14 | 2015-02-19 | System And Method In A Television System For Presenting Information Associated With A User-Selected Object In A Television Program |
US14/731,983 Abandoned US20150296263A1 (en) | 2009-09-14 | 2015-06-05 | System And Method In A Television Controller For Providing User-Selection Of Objects In A Television Program |
US14/753,183 Abandoned US20150304721A1 (en) | 2009-09-14 | 2015-06-29 | System And Method In A Television Receiver For Providing User-Selection Of Objects In A Television Program |
US14/805,961 Abandoned US20150326931A1 (en) | 2009-09-14 | 2015-07-22 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Utilizing An Alternative Communication Network |
US14/851,225 Abandoned US20160007090A1 (en) | 2009-09-14 | 2015-09-11 | System And Method Of A Television For Providing Information Associated With A User-Selected Information Element In A Television Program |
Country Status (4)
Country | Link |
---|---|
US (34) | US20110063522A1 (en) |
EP (1) | EP2328347A3 (en) |
CN (1) | CN102025933A (en) |
TW (1) | TW201132122A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110067064A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20140317660A1 (en) * | 2013-04-22 | 2014-10-23 | LiveRelay Inc. | Enabling interaction between social network users during synchronous display of video channel |
US20160070892A1 (en) * | 2014-08-07 | 2016-03-10 | Click Evidence, Inc. | System and method for creating, processing, and distributing images that serve as portals enabling communication with persons who have interacted with the images |
US10104156B2 (en) * | 2014-06-10 | 2018-10-16 | Fuji Xerox Co., Ltd. | Object image information management server, recording medium, and object image information management method |
US10649666B1 (en) * | 2017-05-10 | 2020-05-12 | Ambarella International Lp | Link-list shortening logic |
Families Citing this family (145)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2596967A (en) * | 1948-11-19 | 1952-05-20 | Westinghouse Electric Corp | Fluorine-containing organosilicon compounds |
US8074248B2 (en) | 2005-07-26 | 2011-12-06 | Activevideo Networks, Inc. | System and method for providing video content associated with a source image to a television in a communication network |
US7515710B2 (en) | 2006-03-14 | 2009-04-07 | Divx, Inc. | Federated digital rights management scheme including trusted systems |
EP3145200A1 (en) | 2007-01-12 | 2017-03-22 | ActiveVideo Networks, Inc. | Mpeg objects and systems and methods for using mpeg objects |
US9826197B2 (en) | 2007-01-12 | 2017-11-21 | Activevideo Networks, Inc. | Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device |
US9455783B2 (en) | 2013-05-06 | 2016-09-27 | Federal Law Enforcement Development Services, Inc. | Network security and variable pulse wave form with continuous communication |
WO2008148050A1 (en) | 2007-05-24 | 2008-12-04 | Federal Law Enforcement Development Services, Inc. | Led light interior room and building communication system |
US9100124B2 (en) | 2007-05-24 | 2015-08-04 | Federal Law Enforcement Development Services, Inc. | LED Light Fixture |
US11265082B2 (en) | 2007-05-24 | 2022-03-01 | Federal Law Enforcement Development Services, Inc. | LED light control assembly and system |
EP2281230A1 (en) * | 2008-04-10 | 2011-02-09 | Karl Christopher Hansen | Simple-to-use optical wireless remote control |
US8775454B2 (en) | 2008-07-29 | 2014-07-08 | James L. Geer | Phone assisted ‘photographic memory’ |
US9128981B1 (en) | 2008-07-29 | 2015-09-08 | James L. Geer | Phone assisted ‘photographic memory’ |
CN105072454B (en) | 2009-01-07 | 2019-04-19 | Sonic IP, Inc. | Specialized, centralized, and automated creation of media guidance for online content
US8890773B1 (en) | 2009-04-01 | 2014-11-18 | Federal Law Enforcement Development Services, Inc. | Visible light transceiver glasses |
US8629938B2 (en) * | 2009-10-05 | 2014-01-14 | Sony Corporation | Multi-point television motion sensor system and method |
KR101689019B1 (en) * | 2009-11-02 | 2016-12-23 | Samsung Electronics Co., Ltd. | Display apparatus supporting a search service, user terminal for performing an object search, and methods thereof
EP2507995A4 (en) | 2009-12-04 | 2014-07-09 | Sonic Ip Inc | Elementary bitstream cryptographic material transport systems and methods |
NL2004780C2 (en) * | 2010-05-28 | 2012-01-23 | Activevideo Networks B V | VISUAL ELEMENT METHOD AND SYSTEM. |
US8717289B2 (en) * | 2010-06-22 | 2014-05-06 | Hsni Llc | System and method for integrating an electronic pointing device into digital image data |
US8683514B2 (en) * | 2010-06-22 | 2014-03-25 | Verizon Patent And Licensing Inc. | Enhanced media content transport stream for media content delivery systems and methods |
US8910218B2 (en) * | 2010-07-15 | 2014-12-09 | Verizon Patent And Licensing Inc. | Method and apparatus for providing control of set-top boxes |
US8330033B2 (en) * | 2010-09-13 | 2012-12-11 | Apple Inc. | Graphical user interface for music sequence programming |
WO2012039694A1 (en) * | 2010-09-21 | 2012-03-29 | Echostar Ukraine, L.L.C. | Synchronizing user interfaces of content receivers and entertainment system components |
KR20130138263A (en) | 2010-10-14 | 2013-12-18 | ActiveVideo Networks, Inc. | Streaming digital video between video devices using a cable television system
US20120106972A1 (en) * | 2010-10-29 | 2012-05-03 | Sunrex Technology Corp. | Universal remote control |
KR20120091496A (en) * | 2010-12-23 | 2012-08-20 | Electronics and Telecommunications Research Institute | System for providing a broadcast service and method of providing the same
US8914534B2 (en) | 2011-01-05 | 2014-12-16 | Sonic Ip, Inc. | Systems and methods for adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol |
CN102693061B (en) * | 2011-03-22 | 2016-06-15 | ZTE Corporation | Method for information display in terminal TV business, terminal and system
EP2695388B1 (en) | 2011-04-07 | 2017-06-07 | ActiveVideo Networks, Inc. | Reduction of latency in video distribution networks using adaptive bit rates |
EP2518992A1 (en) * | 2011-04-28 | 2012-10-31 | Axel Springer Digital TV Guide GmbH | Apparatus and method for managing a personal channel |
US8185448B1 (en) | 2011-06-10 | 2012-05-22 | Myslinski Lucas J | Fact checking method and system |
US9015037B2 (en) | 2011-06-10 | 2015-04-21 | Linkedin Corporation | Interactive fact checking system |
US9087048B2 (en) | 2011-06-10 | 2015-07-21 | Linkedin Corporation | Method of and system for validating a fact checking system |
US9176957B2 (en) | 2011-06-10 | 2015-11-03 | Linkedin Corporation | Selective fact checking method and system |
US8599311B2 (en) * | 2011-07-14 | 2013-12-03 | Amimon Ltd. | Methods circuits devices and systems for transmission and display of video |
US20130036442A1 (en) * | 2011-08-05 | 2013-02-07 | Qualcomm Incorporated | System and method for visual selection of elements in video content |
MX344762B (en) | 2011-08-05 | 2016-12-15 | Fox Sports Productions Inc | Selective capture and presentation of native image portions. |
US11039109B2 (en) | 2011-08-05 | 2021-06-15 | Fox Sports Productions, Llc | System and method for adjusting an image for a vehicle mounted camera |
US9467708B2 (en) | 2011-08-30 | 2016-10-11 | Sonic Ip, Inc. | Selection of resolutions for seamless resolution switching of multimedia content |
US8909922B2 (en) | 2011-09-01 | 2014-12-09 | Sonic Ip, Inc. | Systems and methods for playing back alternative streams of protected content protected using common cryptographic information |
US8964977B2 (en) | 2011-09-01 | 2015-02-24 | Sonic Ip, Inc. | Systems and methods for saving encoded media streamed using adaptive bitrate streaming |
US20130061268A1 (en) * | 2011-09-03 | 2013-03-07 | Ariel Inventions Llc | Systems, devices, and methods for integrated searching and retrieving internet or digital content across a communication network for a multimedia platform |
US8689255B1 (en) | 2011-09-07 | 2014-04-01 | Imdb.Com, Inc. | Synchronizing video content with extrinsic data |
US20130117698A1 (en) * | 2011-10-31 | 2013-05-09 | Samsung Electronics Co., Ltd. | Display apparatus and method thereof |
JP2013123127A (en) * | 2011-12-09 | 2013-06-20 | Fujitsu Mobile Communications Ltd | User terminal and communication method |
AT512350B1 (en) * | 2011-12-20 | 2017-06-15 | Isiqiri Interface Tech Gmbh | Computer system and control process therefor
US9596515B2 (en) | 2012-01-04 | 2017-03-14 | Google Inc. | Systems and methods of image searching |
US10409445B2 (en) | 2012-01-09 | 2019-09-10 | Activevideo Networks, Inc. | Rendering of an interactive lean-backward user interface on a television |
KR20130088662A (en) * | 2012-01-31 | 2013-08-08 | Electronics and Telecommunications Research Institute | Apparatus, method and system for providing additional information through a digital media content
US9800945B2 (en) | 2012-04-03 | 2017-10-24 | Activevideo Networks, Inc. | Class-based intelligent multiplexing over unmanaged networks |
US9123084B2 (en) | 2012-04-12 | 2015-09-01 | Activevideo Networks, Inc. | Graphical application integration with MPEG objects |
WO2013188590A2 (en) * | 2012-06-12 | 2013-12-19 | Realnetworks, Inc. | Context-aware video api systems and methods |
US10440432B2 (en) | 2012-06-12 | 2019-10-08 | Realnetworks, Inc. | Socially annotated presentation systems and methods |
US9800951B1 (en) | 2012-06-21 | 2017-10-24 | Amazon Technologies, Inc. | Unobtrusively enhancing video content with extrinsic data |
US8773591B1 (en) * | 2012-08-13 | 2014-07-08 | Nongqiang Fan | Method and apparatus for interacting with television screen |
US8955021B1 (en) | 2012-08-31 | 2015-02-10 | Amazon Technologies, Inc. | Providing extrinsic data for video content |
US9113128B1 (en) | 2012-08-31 | 2015-08-18 | Amazon Technologies, Inc. | Timeline interface for video content |
KR20140029049A (en) * | 2012-08-31 | 2014-03-10 | Samsung Electronics Co., Ltd. | Display apparatus and input signal processing method using the same
KR20150053955A (en) | 2012-09-06 | 2015-05-19 | Interphase Corporation | Absolute and relative positioning sensor fusion in an interactive display system
US9549217B2 (en) * | 2012-09-17 | 2017-01-17 | Echostar Technologies L.L.C. | Notification controls for television viewing |
CN103313091A (en) * | 2012-09-27 | 2013-09-18 | ZTE Corporation | Speed-multiplying playing method, device and system
WO2014071307A1 (en) * | 2012-11-05 | 2014-05-08 | Velvet Ape, Inc. | Methods for targeted advertising |
US9389745B1 (en) | 2012-12-10 | 2016-07-12 | Amazon Technologies, Inc. | Providing content via multiple display devices |
US9483159B2 (en) | 2012-12-12 | 2016-11-01 | Linkedin Corporation | Fact checking graphical user interface including fact checking icons |
TW201427401A (en) * | 2012-12-18 | 2014-07-01 | Hon Hai Prec Ind Co Ltd | Television, remote controller and menu displaying method |
US9191457B2 (en) | 2012-12-31 | 2015-11-17 | Sonic Ip, Inc. | Systems, methods, and media for controlling delivery of content |
US9313510B2 (en) | 2012-12-31 | 2016-04-12 | Sonic Ip, Inc. | Use of objective quality measures of streamed content to reduce streaming bandwidth |
US10424009B1 (en) | 2013-02-27 | 2019-09-24 | Amazon Technologies, Inc. | Shopping experience using multiple computing devices |
US20140279867A1 (en) * | 2013-03-14 | 2014-09-18 | Ami Entertainment Network, Llc | Method and apparatus for providing real time television listings for venues |
US9906785B2 (en) | 2013-03-15 | 2018-02-27 | Sonic Ip, Inc. | Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata |
US10397292B2 (en) | 2013-03-15 | 2019-08-27 | Divx, Llc | Systems, methods, and media for delivery of content |
WO2014145921A1 (en) | 2013-03-15 | 2014-09-18 | Activevideo Networks, Inc. | A multiple-mode system and method for providing user selectable video content |
US9374411B1 (en) | 2013-03-21 | 2016-06-21 | Amazon Technologies, Inc. | Content recommendations using deep data |
US9094737B2 (en) | 2013-05-30 | 2015-07-28 | Sonic Ip, Inc. | Network video streaming with trick play based on separate trick play files |
US9294785B2 (en) | 2013-06-06 | 2016-03-22 | Activevideo Networks, Inc. | System and method for exploiting scene graph information in construction of an encoded video sequence |
US9326047B2 (en) | 2013-06-06 | 2016-04-26 | Activevideo Networks, Inc. | Overlay rendering of user interface onto source video |
US9219922B2 (en) | 2013-06-06 | 2015-12-22 | Activevideo Networks, Inc. | System and method for exploiting scene graph information in construction of an encoded video sequence |
US11019300B1 (en) | 2013-06-26 | 2021-05-25 | Amazon Technologies, Inc. | Providing soundtrack information during playback of video content |
US9967305B2 (en) | 2013-06-28 | 2018-05-08 | Divx, Llc | Systems, methods, and media for streaming media content |
US20150012840A1 (en) * | 2013-07-02 | 2015-01-08 | International Business Machines Corporation | Identification and Sharing of Selections within Streaming Content |
US10024971B2 (en) * | 2013-07-16 | 2018-07-17 | Walter Fields | Apparatus, system and method for locating a lost instrument or object |
KR102123062B1 (en) * | 2013-08-06 | 2020-06-15 | Samsung Electronics Co., Ltd. | Method of acquiring information about contents, image display apparatus using the same, and server system providing information about contents
US10194189B1 (en) | 2013-09-23 | 2019-01-29 | Amazon Technologies, Inc. | Playback of content using multiple devices |
US20150095320A1 (en) | 2013-09-27 | 2015-04-02 | Trooclick France | Apparatus, systems and methods for scoring the reliability of online information |
US10169424B2 (en) | 2013-09-27 | 2019-01-01 | Lucas J. Myslinski | Apparatus, systems and methods for scoring and distributing the reliability of online information |
CN103500042B (en) * | 2013-09-30 | 2017-04-05 | Hefei BOE Optoelectronics Technology Co., Ltd. | Optical touch screen and display device
US9343112B2 (en) * | 2013-10-31 | 2016-05-17 | Sonic Ip, Inc. | Systems and methods for supplementing content from a server |
US20150117837A1 (en) * | 2013-10-31 | 2015-04-30 | Sonic Ip, Inc. | Systems and methods for supplementing content at a user device |
US20150128194A1 (en) * | 2013-11-05 | 2015-05-07 | Huawei Device Co., Ltd. | Method and mobile terminal for switching playback device |
CN103686413A (en) * | 2013-12-19 | 2014-03-26 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Auxiliary display method and device
US20150198941A1 (en) | 2014-01-15 | 2015-07-16 | John C. Pederson | Cyber Life Electronic Networking and Commerce Operating Exchange |
US20160011675A1 (en) * | 2014-02-20 | 2016-01-14 | Amchael Visual Technology Corporation | Absolute Position 3D Pointing using Light Tracking and Relative Position Detection |
US9972055B2 (en) | 2014-02-28 | 2018-05-15 | Lucas J. Myslinski | Fact checking method and system utilizing social networking information |
US8990234B1 (en) | 2014-02-28 | 2015-03-24 | Lucas J. Myslinski | Efficient fact checking method and system |
US9643722B1 (en) | 2014-02-28 | 2017-05-09 | Lucas J. Myslinski | Drone device security system |
US9838740B1 (en) | 2014-03-18 | 2017-12-05 | Amazon Technologies, Inc. | Enhancing video content with personalized extrinsic data |
US10482658B2 (en) * | 2014-03-31 | 2019-11-19 | Gary Stephen Shuster | Visualization and control of remote objects |
US10873718B2 (en) | 2014-04-02 | 2020-12-22 | Interdigital Madison Patent Holdings, Sas | Systems and methods for touch screens associated with a display |
US9866878B2 (en) | 2014-04-05 | 2018-01-09 | Sonic Ip, Inc. | Systems and methods for encoding and playing back video at different frame rates using enhancement layers |
US9788029B2 (en) | 2014-04-25 | 2017-10-10 | Activevideo Networks, Inc. | Intelligent multiplexing using class-based, multi-dimensioned decision logic for managed networks |
US9661254B2 (en) * | 2014-05-16 | 2017-05-23 | Shadowbox Media, Inc. | Video viewing system with video fragment location |
US9189514B1 (en) | 2014-09-04 | 2015-11-17 | Lucas J. Myslinski | Optimized fact checking method and system |
US10264329B2 (en) * | 2014-10-28 | 2019-04-16 | Disney Enterprises, Inc. | Descriptive metadata extraction and linkage with editorial content |
US10299012B2 (en) * | 2014-10-28 | 2019-05-21 | Disney Enterprises, Inc. | Descriptive metadata extraction and linkage with editorial content |
JP5735696B1 (en) * | 2014-11-05 | 2015-06-17 | DeNA Co., Ltd. | Game program and information processing device
US11758238B2 (en) | 2014-12-13 | 2023-09-12 | Fox Sports Productions, Llc | Systems and methods for displaying wind characteristics and effects within a broadcast |
US11159854B2 (en) | 2014-12-13 | 2021-10-26 | Fox Sports Productions, Llc | Systems and methods for tracking and tagging objects within a broadcast |
US10248982B2 (en) * | 2014-12-23 | 2019-04-02 | Ebay Inc. | Automated extraction of product data from production data of visual media content |
KR20160144817A (en) * | 2015-06-09 | 2016-12-19 | Samsung Electronics Co., Ltd. | Display apparatus, pointing apparatus, pointing system and control methods thereof
US20170048953A1 (en) | 2015-08-11 | 2017-02-16 | Federal Law Enforcement Development Services, Inc. | Programmable switch and system |
US10271109B1 (en) | 2015-09-16 | 2019-04-23 | Amazon Technologies, LLC | Verbal queries relative to video content |
CN105607785B (en) * | 2016-01-04 | 2019-11-12 | BOE Technology Group Co., Ltd. | Touch control display system and touch control operation device
US10021461B2 (en) * | 2016-02-29 | 2018-07-10 | Rovi Guides, Inc. | Systems and methods for performing an action based on context of a feature in a media asset |
US10110968B2 (en) | 2016-04-19 | 2018-10-23 | Google Llc | Methods, systems and media for interacting with content using a second screen device |
US20180052227A1 (en) * | 2016-08-16 | 2018-02-22 | GM Global Technology Operations LLC | Beam pattern diversity-based target location estimation |
CN106991108A (en) * | 2016-09-27 | 2017-07-28 | Alibaba Group Holding Limited | Information pushing method and device
US10708488B2 (en) | 2016-09-27 | 2020-07-07 | Snap Inc. | Eyewear device mode indication |
US10762148B1 (en) * | 2016-12-19 | 2020-09-01 | Wells Fargo Bank, N.A. | Dissemination of information updates across devices |
US11134316B1 (en) | 2016-12-28 | 2021-09-28 | Shopsee, Inc. | Integrated shopping within long-form entertainment |
US10171862B2 (en) * | 2017-02-16 | 2019-01-01 | International Business Machines Corporation | Interactive video search and presentation |
US10498795B2 (en) | 2017-02-17 | 2019-12-03 | Divx, Llc | Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming |
US10678216B2 (en) * | 2017-02-28 | 2020-06-09 | Sap Se | Manufacturing process data collection and analytics |
US10440439B2 (en) | 2017-02-28 | 2019-10-08 | The Directv Group, Inc. | Method and apparatus for media content streaming and reminder notifications |
US10558197B2 (en) | 2017-02-28 | 2020-02-11 | Sap Se | Manufacturing process data collection and analytics |
CA3080472A1 (en) * | 2017-09-13 | 2019-03-21 | Source Digital, Inc. | Rules-based ancillary data |
US20190208236A1 (en) * | 2018-01-02 | 2019-07-04 | Source Digital, Inc. | Coordinates as ancillary data |
CN108123858A (en) * | 2018-01-03 | 2018-06-05 | Shenzhen Shushitong Technology Co., Ltd. | Domestic news system based on three-network integration
WO2019191708A1 (en) | 2018-03-30 | 2019-10-03 | Realnetworks, Inc. | Socially annotated audiovisual content |
CN108882003A (en) * | 2018-07-25 | 2018-11-23 | Anhui Xinhua University | Electronic software control system capable of automatically detecting exciting sports events
CN110858914B (en) * | 2018-08-23 | 2021-11-26 | Alibaba (China) Co., Ltd. | Video material recommendation method and device
US11269944B2 (en) | 2018-12-14 | 2022-03-08 | Sony Interactive Entertainment LLC | Targeted gaming news and content feeds |
US11247130B2 (en) | 2018-12-14 | 2022-02-15 | Sony Interactive Entertainment LLC | Interactive objects in streaming media and marketplace ledgers |
US11896909B2 (en) | 2018-12-14 | 2024-02-13 | Sony Interactive Entertainment LLC | Experience-based peer recommendations |
US11080748B2 (en) | 2018-12-14 | 2021-08-03 | Sony Interactive Entertainment LLC | Targeted gaming news and content feeds |
US10881962B2 (en) | 2018-12-14 | 2021-01-05 | Sony Interactive Entertainment LLC | Media-activity binding and content blocking |
US11531701B2 (en) * | 2019-04-03 | 2022-12-20 | Samsung Electronics Co., Ltd. | Electronic device and control method thereof |
US20210065719A1 (en) * | 2019-08-29 | 2021-03-04 | Comcast Cable Communications, Llc | Methods and systems for intelligent content controls |
US11213748B2 (en) | 2019-11-01 | 2022-01-04 | Sony Interactive Entertainment Inc. | Content streaming with gameplay launch |
CN111552429B (en) * | 2020-04-29 | 2021-07-23 | Hangzhou Hikvision Digital Technology Co., Ltd. | Image selection method and device, and electronic equipment
US11442987B2 (en) | 2020-05-28 | 2022-09-13 | Sony Interactive Entertainment Inc. | Media-object binding for displaying real-time play data for live-streaming media |
US11602687B2 (en) | 2020-05-28 | 2023-03-14 | Sony Interactive Entertainment Inc. | Media-object binding for predicting performance in a media |
US11420130B2 (en) | 2020-05-28 | 2022-08-23 | Sony Interactive Entertainment Inc. | Media-object binding for dynamic generation and displaying of play data associated with media |
US11671657B2 (en) * | 2021-06-30 | 2023-06-06 | Rovi Guides, Inc. | Method and apparatus for shared viewing of media content |
US20230010078A1 (en) * | 2021-07-12 | 2023-01-12 | Avago Technologies International Sales Pte. Limited | Object or region of interest video processing system and method |
EP4290266A1 (en) * | 2021-08-23 | 2023-12-13 | Samsung Electronics Co., Ltd. | Electronic device for controlling external electronic device and operation method thereof |
Citations (99)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5111511A (en) * | 1988-06-24 | 1992-05-05 | Matsushita Electric Industrial Co., Ltd. | Image motion vector detecting apparatus |
US5408258A (en) * | 1993-04-21 | 1995-04-18 | The Arbitron Company | Method of automatically qualifying a signal reproduction device for installation of monitoring equipment |
US5543851A (en) * | 1995-03-13 | 1996-08-06 | Chang; Wen F. | Method and apparatus for translating closed caption data |
US5602568A (en) * | 1994-12-22 | 1997-02-11 | Goldstar Co., Ltd. | Point type remote control apparatus and the method thereof |
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US5718845A (en) * | 1990-12-12 | 1998-02-17 | Enichem S.P.A. | Tricyanovinyl substitution process for NLO polymers |
US5721584A (en) * | 1994-07-22 | 1998-02-24 | Sony Corporation | Two-way broadcast system and receiving system |
US5727141A (en) * | 1995-05-05 | 1998-03-10 | Apple Computer, Inc. | Method and apparatus for identifying user-selectable regions within multiple display frames |
US5793361A (en) * | 1994-06-09 | 1998-08-11 | Corporation For National Research Initiatives | Unconstrained pointing interface for natural human interaction with a display-based computer system |
US5929849A (en) * | 1996-05-02 | 1999-07-27 | Phoenix Technologies, Ltd. | Integration of dynamic universal resource locators with television presentations |
US6097441A (en) * | 1997-12-31 | 2000-08-01 | Eremote, Inc. | System for dual-display interaction with integrated television and internet content |
US6255961B1 (en) * | 1998-05-08 | 2001-07-03 | Sony Corporation | Two-way communications between a remote control unit and one or more devices in an audio/visual environment |
US6349410B1 (en) * | 1999-08-04 | 2002-02-19 | Intel Corporation | Integrating broadcast television pause and web browsing |
US20020040482A1 (en) * | 2000-04-08 | 2002-04-04 | Sextro Gary L. | Features for interactive television |
US20020042925A1 (en) * | 2000-07-24 | 2002-04-11 | Koji Ebisu | Television receiver, receiver and program execution method |
US20020056136A1 (en) * | 1995-09-29 | 2002-05-09 | Wistendahl Douglass A. | System for converting existing TV content to interactive TV programs operated with a standard remote control and TV set-top box |
US20020069405A1 (en) * | 2000-09-20 | 2002-06-06 | Chapin Paul W. | System and method for spokesperson interactive television advertisements |
US6407779B1 (en) * | 1999-03-29 | 2002-06-18 | Zilog, Inc. | Method and apparatus for an intuitive universal remote control system |
US20020078446A1 (en) * | 2000-08-30 | 2002-06-20 | Jon Dakss | Method and apparatus for hyperlinking in a television broadcast |
US20020090114A1 (en) * | 1995-07-27 | 2002-07-11 | Rhoads Geoffrey B. | Watermark enabled video objects |
US20020120934A1 (en) * | 2001-02-28 | 2002-08-29 | Marc Abrahams | Interactive television browsing and buying method |
US20020136432A1 (en) * | 2000-11-10 | 2002-09-26 | Hiroyuki Koike | Method and apparatus for processing information of an object |
US20030005445A1 (en) * | 1995-10-02 | 2003-01-02 | Schein Steven M. | Systems and methods for linking television viewers with advertisers and broadcasters |
US20030023981A1 (en) * | 2001-07-25 | 2003-01-30 | Thomas Lemmons | Method and apparatus for transmission of interactive and enhanced television data |
US20030035075A1 (en) * | 2001-08-20 | 2003-02-20 | Butler Michelle A. | Method and system for providing improved user input capability for interactive television |
US20030051253A1 (en) * | 2001-08-16 | 2003-03-13 | Barone Samuel T. | Interactive television tracking system |
US20030054878A1 (en) * | 2001-09-20 | 2003-03-20 | International Game Technology | Point of play registration on a gaming machine |
US6538672B1 (en) * | 1999-02-08 | 2003-03-25 | Koninklijke Philips Electronics N.V. | Method and apparatus for displaying an electronic program guide |
US20030115602A1 (en) * | 1995-06-07 | 2003-06-19 | Knee Robert Alan | Electronic television program guide schedule system and method with data feed access |
US20030145326A1 (en) * | 2002-01-31 | 2003-07-31 | Koninklijke Philips Electronics N.V. | Subscription to TV channels/shows based on recommendation generated by a TV recommender |
US20040078814A1 (en) * | 2002-03-29 | 2004-04-22 | Digeo, Inc. | Module-based interactive television ticker |
US20040119701A1 (en) * | 2002-12-19 | 2004-06-24 | Mulligan Roger C. | Lattice touch-sensing system |
US20040167855A1 (en) * | 2001-04-26 | 2004-08-26 | Cambridge Vivien Johan | Automatic billing system for remote internet services |
US20050028208A1 (en) * | 1998-07-17 | 2005-02-03 | United Video Properties, Inc. | Interactive television program guide with remote access |
US20050086690A1 (en) * | 2003-10-16 | 2005-04-21 | International Business Machines Corporation | Interactive, non-intrusive television advertising |
US20050132420A1 (en) * | 2003-12-11 | 2005-06-16 | Quadrock Communications, Inc | System and method for interaction with television content |
US20050137958A1 (en) * | 2003-12-23 | 2005-06-23 | Thomas Huber | Advertising methods for advertising time slots and embedded objects |
US20050153687A1 (en) * | 2004-01-13 | 2005-07-14 | Nokia Corporation | Providing location information |
US6931660B1 (en) * | 2000-01-28 | 2005-08-16 | Opentv, Inc. | Interactive television system and method for simultaneous transmission and rendering of multiple MPEG-encoded video streams |
US20060037044A1 (en) * | 1993-03-29 | 2006-02-16 | Microsoft Corporation | Pausing television programming in response to selection of hypertext link |
US20060099964A1 (en) * | 2004-11-05 | 2006-05-11 | Ebay Inc. | System and method for location based content correlation |
US7057670B2 (en) * | 2000-04-27 | 2006-06-06 | Dan Kikinis | Cursor control system |
US20060152489A1 (en) * | 2005-01-12 | 2006-07-13 | John Sweetser | Handheld vision based absolute pointing system |
US20060174273A1 (en) * | 2004-11-20 | 2006-08-03 | Samsung Electronics Co., Ltd. | Method of displaying service in DMB, and method and apparatus for managing preferred service |
US20060195878A1 (en) * | 2000-04-12 | 2006-08-31 | Lg Electronics Inc. | Apparatus and method for providing and obtaining product information through a broadcast signal |
US7158676B1 (en) * | 1999-02-01 | 2007-01-02 | Emuse Media Limited | Interactive system |
US20070097275A1 (en) * | 2001-09-27 | 2007-05-03 | Universal Electronics Inc. | Two way communication using light links |
US20070130581A1 (en) * | 2000-02-02 | 2007-06-07 | Del Sesto Eric E | Interactive content delivery methods and apparatus |
US20070137611A1 (en) * | 2005-12-21 | 2007-06-21 | Yu Robert C | Active radical initiator for internal combustion engines |
US20070156521A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Systems and methods for commerce in media program related merchandise |
US20070157260A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Interactive media guidance system having multiple devices |
US20070195205A1 (en) * | 2006-02-21 | 2007-08-23 | Lowe Jerry B | Remote control system and method |
US20070199014A1 (en) * | 2006-02-22 | 2007-08-23 | E-Cast, Inc. | Consumer portal |
US20080016526A1 (en) * | 2000-03-09 | 2008-01-17 | Asmussen Michael L | Advanced Set Top Terminal Having A Program Pause Feature With Voice-to-Text Conversion |
US20080052750A1 (en) * | 2006-08-28 | 2008-02-28 | Anders Grunnet-Jepsen | Direct-point on-demand information exchanges |
US20080066129A1 (en) * | 2000-02-29 | 2008-03-13 | Goldpocket Interactive, Inc. | Method and Apparatus for Interaction with Hyperlinks in a Television Broadcast |
US20080066097A1 (en) * | 2004-10-13 | 2008-03-13 | Woodhyun Park | Method Of Realizing Interactive Advertisement Under Digital Broadcasting Environment By Extending Program Associated Data-Broadcasting To Internet Area
US7344084B2 (en) * | 2005-09-19 | 2008-03-18 | Sony Corporation | Portable video programs |
US7360232B2 (en) * | 2001-04-25 | 2008-04-15 | Diego, Inc. | System and method to subscribe to channel URL addresses and to provide non-programming-related URL addresses in an interactive video casting system |
US20080109851A1 (en) * | 2006-10-23 | 2008-05-08 | Ashley Heather | Method and system for providing interactive video |
US20080132163A1 (en) * | 2000-05-31 | 2008-06-05 | Optinetix (Israel) Ltd. | Systems and methods for distributing information through broadcast media |
US20080136754A1 (en) * | 2006-12-06 | 2008-06-12 | Sony Corporation | Display apparatus, display-apparatus control method and program |
US20080172693A1 (en) * | 2007-01-16 | 2008-07-17 | Microsoft Corporation | Representing Television Programs Using Video Objects |
US20080172252A1 (en) * | 2007-01-17 | 2008-07-17 | Mitochon Systems, Inc | Apparatus and Method for Revenue Distribution Generated From Delivering Healthcare Advertisements Via EMR Systems, RHIN, and Electronic Advertising Servers |
US20080177570A1 (en) * | 2007-01-18 | 2008-07-24 | Ari Craine | Methods, Systems, and Computer-Readable Media for Disease Management |
US7409437B2 (en) * | 1996-03-08 | 2008-08-05 | Actv, Inc. | Enhanced video programming system and method for incorporating and displaying retrieved integrated Internet information segments |
US20080209480A1 (en) * | 2006-12-20 | 2008-08-28 | Eide Kurt S | Method for enhanced video programming system for integrating internet data for on-demand interactive retrieval |
US20080204603A1 (en) * | 2007-02-27 | 2008-08-28 | Hideharu Hattori | Video displaying apparatus and video displaying method |
US20080204605A1 (en) * | 2007-02-28 | 2008-08-28 | Leonard Tsai | Systems and methods for using a remote control unit to sense television characteristics |
US20090021473A1 (en) * | 2002-12-08 | 2009-01-22 | Grant Danny A | Haptic Communication Devices |
US20090037947A1 (en) * | 2007-07-30 | 2009-02-05 | Yahoo! Inc. | Textual and visual interactive advertisements in videos |
US20090077394A1 (en) * | 2007-09-17 | 2009-03-19 | Jr-Shian Tsai | Techniques for communications based power management |
US20090083815A1 (en) * | 2007-09-19 | 2009-03-26 | Mcmaster Orlando | Generating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time |
US20090113475A1 (en) * | 2007-08-21 | 2009-04-30 | Yi Li | Systems and methods for integrating search capability in interactive video |
US7535456B2 (en) * | 2004-04-30 | 2009-05-19 | Hillcrest Laboratories, Inc. | Methods and devices for removing unintentional movement in 3D pointing devices |
US7536706B1 (en) * | 1998-08-24 | 2009-05-19 | Sharp Laboratories Of America, Inc. | Information enhanced audio video encoding system |
US20090165041A1 (en) * | 2007-12-21 | 2009-06-25 | Penberthy John S | System and Method for Providing Interactive Content with Video Content |
US20090165048A1 (en) * | 2007-12-19 | 2009-06-25 | United Video Properties, Inc. | Methods and devices for presenting guide listings and guidance data in three dimensions in an interactive media guidance application |
US20090187862A1 (en) * | 2008-01-22 | 2009-07-23 | Sony Corporation | Method and apparatus for the intuitive browsing of content |
US20090199259A1 (en) * | 2001-02-02 | 2009-08-06 | Rachad Alao | Service gateway for interactive television |
US20090217317A1 (en) * | 2008-02-26 | 2009-08-27 | At&T Knowledge Ventures, L.P. | System and method for promoting marketable items |
US20100064320A1 (en) * | 2006-03-13 | 2010-03-11 | Verizon Services Corp. | Integrating data on program popularity into an on-screen program guide |
US20100098074A1 (en) * | 2008-10-22 | 2010-04-22 | Backchannelmedia Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US20100162303A1 (en) * | 2008-12-23 | 2010-06-24 | Cassanova Jeffrey P | System and method for selecting an object in a video data stream |
US20100157152A1 (en) * | 2008-12-18 | 2010-06-24 | Thomson Licensing | Display device with feedback elements and method for monitoring |
US20100218228A1 (en) * | 2009-02-20 | 2010-08-26 | Walter Edward A | System and method for processing image objects in video data |
US7889175B2 (en) * | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
US7890380B2 (en) * | 2007-05-07 | 2011-02-15 | At&T Intellectual Property I, L.P. | Method, system, and computer readable medium for implementing sales of products using a trace of an object |
US20110067069A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a parallel television system for providing for user-selection of an object in a television program |
US20110141013A1 (en) * | 2009-12-14 | 2011-06-16 | Alcatel-Lucent Usa, Incorporated | User-interface apparatus and method for user control |
US20110179435A1 (en) * | 2005-12-29 | 2011-07-21 | Charles Cordray | Systems and methods for managing content |
US7987478B2 (en) * | 2007-08-28 | 2011-07-26 | Sony Ericsson Mobile Communications Ab | Methods, devices, and computer program products for providing unobtrusive video advertising content |
US8095423B2 (en) * | 2006-03-17 | 2012-01-10 | Grant Allen Lee Nichols | Interactive international bulk trade television |
US8181212B2 (en) * | 2008-10-30 | 2012-05-15 | Frederic Sigal | Method of providing a frame-based object redirection overlay for a video stream |
US20120154268A1 (en) * | 2007-05-14 | 2012-06-21 | Apple Inc. | Remote control systems that can distinguish stray light sources |
US20120163776A1 (en) * | 1998-11-02 | 2012-06-28 | United Video Properties, Inc. | Interactive program guide with continuous data stream and client-server data supplementation |
US8223136B2 (en) * | 2005-06-07 | 2012-07-17 | Intel Corporation | Error detection and prevention in acoustic data |
US8359628B2 (en) * | 2007-10-05 | 2013-01-22 | Sony Corporation | Display device and transmitting device |
US8421746B2 (en) * | 2006-09-07 | 2013-04-16 | Porto Vinci Ltd. Limited Liability Company | Device control using multi-dimensional motion sensing and a wireless home entertainment hub |
Family Cites Families (119)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7207053B1 (en) | 1992-12-09 | 2007-04-17 | Sedna Patent Services, Llc | Method and apparatus for locally targeting virtual objects within a terminal |
US5784056A (en) * | 1995-12-29 | 1998-07-21 | Sun Microsystems, Inc. | System and method for temporally varying pointer icons |
US20030212996A1 (en) * | 1996-02-08 | 2003-11-13 | Wolzien Thomas R. | System for interconnection of audio program data transmitted by radio to remote vehicle or individual with GPS location |
US5661502A (en) * | 1996-02-16 | 1997-08-26 | Ast Research, Inc. | Self-adjusting digital filter for smoothing computer mouse movement |
US5894843A (en) * | 1996-02-20 | 1999-04-20 | Cardiothoracic Systems, Inc. | Surgical method for stabilizing the beating heart during coronary artery bypass graft surgery |
US6006256A (en) * | 1996-03-11 | 1999-12-21 | Opentv, Inc. | System and method for inserting interactive program content within a television signal originating at a remote network |
US6057831A (en) | 1996-08-14 | 2000-05-02 | Samsung Electronics Co., Ltd. | TV graphical user interface having cursor position indicator |
GB2358779B (en) * | 1996-12-13 | 2001-10-10 | Ibm | System, method, and pointing device for remote operation of data processing apparatus |
US6256785B1 (en) * | 1996-12-23 | 2001-07-03 | Corporate Media Patners | Method and system for providing interactive look-and-feel in a digital broadcast via an X-Y protocol |
US7617508B2 (en) * | 2003-12-12 | 2009-11-10 | At&T Intellectual Property I, L.P. | Methods and systems for collaborative capture of television viewer generated clickstreams |
KR100288976B1 (en) | 1997-01-08 | 2001-05-02 | 윤종용 | Method for constructing and recognizing menu commands of television receiver |
US6317714B1 (en) | 1997-02-04 | 2001-11-13 | Microsoft Corporation | Controller and associated mechanical characters operable for continuously performing received control data while engaging in bidirectional communications over a single communications channel |
US6045588A (en) * | 1997-04-29 | 2000-04-04 | Whirlpool Corporation | Non-aqueous washing apparatus and method |
US7809138B2 (en) | 1999-03-16 | 2010-10-05 | Intertrust Technologies Corporation | Methods and apparatus for persistent control and protection of content |
BR9912386A (en) | 1998-07-23 | 2001-10-02 | Diva Systems Corp | System and process for generating and using an interactive user interface |
TW463503B (en) * | 1998-08-26 | 2001-11-11 | United Video Properties Inc | Television chat system |
US6357042B2 (en) * | 1998-09-16 | 2002-03-12 | Anand Srinivasan | Method and apparatus for multiplexing separately-authored metadata for insertion into a video data stream |
US6626570B2 (en) * | 1998-10-16 | 2003-09-30 | Kenneth Fox Supply Company | Produce bag with draw top |
US6532592B1 (en) | 1998-11-09 | 2003-03-11 | Sony Corporation | Bi-directional remote control unit and method of using the same |
US6314569B1 (en) | 1998-11-25 | 2001-11-06 | International Business Machines Corporation | System for video, audio, and graphic presentation in tandem with video/audio play |
US6282713B1 (en) | 1998-12-21 | 2001-08-28 | Sony Corporation | Method and apparatus for providing on-demand electronic advertising |
US6122660A (en) | 1999-02-22 | 2000-09-19 | International Business Machines Corporation | Method for distributing digital TV signal and selection of content |
US7102616B1 (en) | 1999-03-05 | 2006-09-05 | Microsoft Corporation | Remote control device with pointing capacity |
US8479251B2 (en) | 1999-03-31 | 2013-07-02 | Microsoft Corporation | System and method for synchronizing streaming content with enhancing content using pre-announced triggers |
JP4497577B2 (en) | 1999-04-05 | 2010-07-07 | キヤノン株式会社 | Multi-beam optical scanning device |
US7325245B1 (en) * | 1999-09-30 | 2008-01-29 | Intel Corporation | Linking to video information |
EP1190298A2 (en) * | 1999-12-22 | 2002-03-27 | Koninklijke Philips Electronics N.V. | Pointer coordinates assignment |
GB0004811D0 (en) | 2000-03-01 | 2000-04-19 | Pace Micro Tech Plc | Improvements relating to broadcast data receiving apparatus |
US7979881B1 (en) | 2000-03-30 | 2011-07-12 | Microsoft Corporation | System and method for identifying audio/visual programs to be recorded |
JP2003529844A (en) * | 2000-03-31 | 2003-10-07 | ユナイテッド ビデオ プロパティーズ, インコーポレイテッド | System and method for advertising linked by metadata |
US8205223B2 (en) | 2000-04-12 | 2012-06-19 | Lg Electronics Inc. | Method and video device for accessing information |
US20050193425A1 (en) * | 2000-07-24 | 2005-09-01 | Sanghoon Sull | Delivery and presentation of content-relevant information associated with frames of audio-visual programs |
US7103908B2 (en) | 2000-07-25 | 2006-09-05 | Diego, Inc. | Method and system to save context for deferred transaction via interactive television |
US20020056109A1 (en) * | 2000-07-25 | 2002-05-09 | Tomsen Mai-Lan | Method and system to provide a personalized shopping channel VIA an interactive video casting system |
JP2002057645A (en) * | 2000-08-10 | 2002-02-22 | Ntt Docomo Inc | Method for data transfer and mobile unit server |
WO2002025556A1 (en) * | 2000-09-21 | 2002-03-28 | Digital Network Shopping, Llc | Method and apparatus for digital shopping |
US6920244B2 (en) * | 2000-10-06 | 2005-07-19 | Rochester Institute Of Technology | Data-efficient and self adapting imaging spectrometry method and an apparatus thereof |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US20020069404A1 (en) * | 2000-11-28 | 2002-06-06 | Navic Systems, Incorporated | Targeted promotion deployment |
US20050039214A1 (en) * | 2001-02-21 | 2005-02-17 | Lorenz Kim E. | System and method for providing direct, context-sensitive customer support in an interactive television system |
WO2003012744A1 (en) | 2001-08-02 | 2003-02-13 | Intellocity Usa, Inc. | Post production visual alterations |
KR100846761B1 (en) | 2001-09-11 | 2008-07-16 | 삼성전자주식회사 | Pointer control method, pointing apparatus and host apparatus therefor |
US20030079224A1 (en) | 2001-10-22 | 2003-04-24 | Anton Komar | System and method to provide additional information associated with selectable display areas |
US20030182393A1 (en) * | 2002-03-25 | 2003-09-25 | Sony Corporation | System and method for retrieving uniform resource locators from television content |
US20050177861A1 (en) | 2002-04-05 | 2005-08-11 | Matsushita Electric Industrial Co., Ltd | Asynchronous integration of portable handheld device |
US6967566B2 (en) * | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US7725398B2 (en) | 2002-06-19 | 2010-05-25 | Eastman Kodak Company | Method and system for selling goods and/or services over a communication network between multiple users |
US7266835B2 (en) | 2002-06-27 | 2007-09-04 | Digeo, Inc. | Method and apparatus for secure transactions in an interactive television ticker |
US20040095317A1 (en) * | 2002-11-20 | 2004-05-20 | Jingxi Zhang | Method and apparatus of universal remote pointing control for home entertainment system and computer |
AU2003293240A1 (en) | 2002-12-02 | 2004-06-23 | Matsushita Electric Industrial Co., Ltd. | Portable device for viewing real-time synchronized information from broadcasting sources |
EP1463052A1 (en) * | 2003-03-25 | 2004-09-29 | Deutsche Thomson-Brandt Gmbh | Method for representing animated menu buttons |
US20040221025A1 (en) * | 2003-04-29 | 2004-11-04 | Johnson Ted C. | Apparatus and method for monitoring computer networks |
JP2004347320A (en) * | 2003-05-15 | 2004-12-09 | Advantest Corp | Display and method for measuring and displaying signal |
US7053965B1 (en) | 2003-06-10 | 2006-05-30 | Fan Nong-Qiang | Remote control for controlling a computer using a screen of a television |
US8635643B2 (en) | 2003-06-30 | 2014-01-21 | At&T Intellectual Property I, L.P. | System and method for providing interactive media content over a network |
CN1706178B (en) * | 2003-09-12 | 2010-10-06 | 松下电器产业株式会社 | Image displaying apparatus and method |
US8286203B2 (en) | 2003-12-19 | 2012-10-09 | At&T Intellectual Property I, L.P. | System and method for enhanced hot key delivery |
JP4192819B2 (en) | 2004-03-19 | 2008-12-10 | ソニー株式会社 | Information processing apparatus and method, recording medium, and program |
US20050229227A1 (en) | 2004-04-13 | 2005-10-13 | Evenhere, Inc. | Aggregation of retailers for televised media programming product placement |
US20050234782A1 (en) * | 2004-04-14 | 2005-10-20 | Schackne Raney J | Clothing and model image generation, combination, display, and selection |
US20050251835A1 (en) | 2004-05-07 | 2005-11-10 | Microsoft Corporation | Strategies for pausing and resuming the presentation of programs |
US20050289593A1 (en) * | 2004-05-26 | 2005-12-29 | Skipjam Corp. | Method and system for displaying and selecting content of an electronic program guide |
US7335456B2 (en) * | 2004-05-27 | 2008-02-26 | International Business Machines Corporation | Top coat material and use thereof in lithography processes |
US7542072B2 (en) * | 2004-07-28 | 2009-06-02 | The University Of Maryland | Device using a camera and light polarization for the remote displacement of a cursor on a display |
TWI236289B (en) * | 2004-08-11 | 2005-07-11 | Pixart Imaging Inc | Interactive device capable of improving image processing |
WO2006018775A2 (en) * | 2004-08-12 | 2006-02-23 | Philips Intellectual Property & Standards Gmbh | Method and system for controlling a display |
US20070266406A1 (en) | 2004-11-09 | 2007-11-15 | Murali Aravamudan | Method and system for performing actions using a non-intrusive television with reduced text input |
US7576757B2 (en) * | 2004-11-24 | 2009-08-18 | General Electric Company | System and method for generating most read images in a PACS workstation |
JP2006260028A (en) * | 2005-03-16 | 2006-09-28 | Sony Corp | Remote control system, remote controller, remote control method, information processor, information processing method and program |
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
US20060259930A1 (en) * | 2005-05-10 | 2006-11-16 | Rothschild Leigh M | System and method for obtaining information on digital media content |
US20060268895A1 (en) | 2005-05-17 | 2006-11-30 | Kotzin Michael D | Linking a mobile wireless communication device to a proximal consumer broadcast device |
US7814022B2 (en) | 2005-06-10 | 2010-10-12 | Aniruddha Gupte | Enhanced media method and apparatus for use in digital distribution system |
US20070150368A1 (en) * | 2005-09-06 | 2007-06-28 | Samir Arora | On-line personalized content and merchandising environment |
US20070078732A1 (en) * | 2005-09-14 | 2007-04-05 | Crolley C W | Interactive information access system |
JP4453647B2 (en) * | 2005-10-28 | 2010-04-21 | セイコーエプソン株式会社 | Moving image display device and moving image display method |
JP2007251446A (en) * | 2006-03-15 | 2007-09-27 | Sharp Corp | Receiving apparatus, and receiving system |
EP2011017A4 (en) | 2006-03-30 | 2010-07-07 | Stanford Res Inst Int | Method and apparatus for annotating media streams |
US8789100B2 (en) * | 2006-05-31 | 2014-07-22 | Telecom Italia S.P.A. | Method and TV receiver for storing contents associated to TV programs |
US8261300B2 (en) | 2006-06-23 | 2012-09-04 | Tivo Inc. | Method and apparatus for advertisement placement in a user dialog on a set-top box |
US8775452B2 (en) | 2006-09-17 | 2014-07-08 | Nokia Corporation | Method, apparatus and computer program product for providing standard real world to virtual world links |
US8813118B2 (en) * | 2006-10-03 | 2014-08-19 | Verizon Patent And Licensing Inc. | Interactive content for media content access systems and methods |
US20080089551A1 (en) | 2006-10-16 | 2008-04-17 | Ashley Heather | Interactive TV data track synchronization system and method |
US9218213B2 (en) * | 2006-10-31 | 2015-12-22 | International Business Machines Corporation | Dynamic placement of heterogeneous workloads |
WO2008055204A2 (en) * | 2006-10-31 | 2008-05-08 | Dotted Pair, Inc. | System and method for interacting with item catalogs |
US8294825B2 (en) | 2006-11-16 | 2012-10-23 | Sharp Kabushiki Kaisha | Image display device and image display method |
US8269746B2 (en) * | 2006-11-27 | 2012-09-18 | Microsoft Corporation | Communication with a touch screen |
KR101576943B1 (en) * | 2006-12-01 | 2015-12-15 | 에이치에스엔아이 엘엘씨 | Method and System for Improved Interactive Television Processing |
CA2571617A1 (en) * | 2006-12-15 | 2008-06-15 | Desktopbox Inc. | Simulcast internet media distribution system and method |
KR100818990B1 (en) * | 2006-12-28 | 2008-04-04 | 삼성전자주식회사 | Apparatus and method for transferring moving signal |
US20080184132A1 (en) | 2007-01-31 | 2008-07-31 | Zato Thomas J | Media content tagging |
US8181206B2 (en) * | 2007-02-28 | 2012-05-15 | Time Warner Cable Inc. | Personal content server apparatus and methods |
KR20080099592A (en) * | 2007-05-10 | 2008-11-13 | 엘지전자 주식회사 | Remote controlling unit and method for operating remotely |
US8290513B2 (en) * | 2007-06-28 | 2012-10-16 | Apple Inc. | Location-based services |
US20090006211A1 (en) | 2007-07-01 | 2009-01-01 | Decisionmark Corp. | Network Content And Advertisement Distribution System and Method |
JPWO2009014206A1 (en) * | 2007-07-26 | 2010-10-07 | シャープ株式会社 | Remote control device and television broadcast receiver |
US8744118B2 (en) | 2007-08-03 | 2014-06-03 | At&T Intellectual Property I, L.P. | Methods, systems, and products for indexing scenes in digital media |
KR101348346B1 (en) * | 2007-09-06 | 2014-01-08 | 삼성전자주식회사 | Pointing apparatus, pointer controlling apparatus, pointing method and pointer controlling method |
KR101132592B1 (en) | 2007-09-14 | 2012-04-06 | 엔이씨 유럽 리미티드 | Method and system for optimizing network performances |
US8140012B1 (en) * | 2007-10-25 | 2012-03-20 | At&T Mobility Ii Llc | Bluetooth security profile |
US8875212B2 (en) | 2008-04-15 | 2014-10-28 | Shlomo Selim Rakib | Systems and methods for remote control of interactive video |
US8271357B2 (en) * | 2007-12-11 | 2012-09-18 | Ebay Inc. | Presenting items based on activity rates |
JP5228498B2 (en) * | 2008-01-22 | 2013-07-03 | 富士通株式会社 | retrieval method |
US20090235312A1 (en) * | 2008-03-11 | 2009-09-17 | Amir Morad | Targeted content with broadcast material |
WO2009120616A1 (en) * | 2008-03-25 | 2009-10-01 | Wms Gaming, Inc. | Generating casino floor maps |
US8549556B2 (en) | 2008-04-15 | 2013-10-01 | Shlomo Selim Rakib | Contextual advertising |
US20090256811A1 (en) * | 2008-04-15 | 2009-10-15 | Sony Ericsson Mobile Communications Ab | Optical touch screen |
US8760401B2 (en) | 2008-04-21 | 2014-06-24 | Ron Kimmel | System and method for user object selection in geographic relation to a video display |
US9256882B2 (en) | 2008-05-27 | 2016-02-09 | At&T Intellectual Property I, Lp. | Methods, communications devices, and computer program products for selecting an advertisement to initiate device-to-device communications |
EP2337353A4 (en) * | 2008-09-08 | 2012-04-04 | Sharp Kk | Control system, video display device, and remote control device |
US8239359B2 (en) * | 2008-09-23 | 2012-08-07 | Disney Enterprises, Inc. | System and method for visual search in a video media player |
KR100972932B1 (en) * | 2008-10-16 | 2010-07-28 | 인하대학교 산학협력단 | Touch Screen Panel |
US7756758B2 (en) * | 2008-12-08 | 2010-07-13 | Hsn Lp | Method and system for improved E-commerce shopping |
US20100257448A1 (en) | 2009-04-06 | 2010-10-07 | Interactical Llc | Object-Based Interactive Programming Device and Method |
EP2264580A2 (en) * | 2009-06-10 | 2010-12-22 | Samsung Electronics Co., Ltd. | Method and apparatus for processing motion data |
US9118468B2 (en) * | 2009-07-23 | 2015-08-25 | Qualcomm Incorporated | Asynchronous time division duplex operation in a wireless network |
US8272012B2 (en) * | 2009-07-29 | 2012-09-18 | Echostar Technologies L.L.C. | User-controlled data/video integration by a video control system |
US9232167B2 (en) | 2009-08-04 | 2016-01-05 | Echostar Technologies L.L.C. | Video system and remote control with touch interface for supplemental content display |
JP2013042196A (en) * | 2009-12-21 | 2013-02-28 | Panasonic Corp | Reproduction device |
2010
- 2010-05-05 US US12/774,221 patent/US20110063522A1/en not_active Abandoned
- 2010-05-05 US US12/774,154 patent/US9110517B2/en active Active
- 2010-05-05 US US12/774,380 patent/US8990854B2/en active Active
- 2010-05-05 US US12/774,321 patent/US8947350B2/en active Active
- 2010-08-05 US US12/850,911 patent/US9197941B2/en active Active
- 2010-08-05 US US12/851,036 patent/US9462345B2/en not_active Expired - Fee Related
- 2010-08-05 US US12/850,945 patent/US9081422B2/en active Active
- 2010-08-05 US US12/850,866 patent/US9098128B2/en active Active
- 2010-08-05 US US12/851,075 patent/US20110067069A1/en not_active Abandoned
- 2010-08-05 US US12/850,832 patent/US20110067047A1/en not_active Abandoned
- 2010-08-30 EP EP10009014.1A patent/EP2328347A3/en not_active Withdrawn
- 2010-09-13 US US12/880,888 patent/US8819732B2/en active Active
- 2010-09-13 US US12/881,110 patent/US9137577B2/en active Active
- 2010-09-13 US US12/881,031 patent/US20110066929A1/en not_active Abandoned
- 2010-09-13 US US12/880,668 patent/US8832747B2/en active Active
- 2010-09-13 US US12/881,067 patent/US9043833B2/en active Active
- 2010-09-13 US US12/881,004 patent/US8931015B2/en active Active
- 2010-09-13 US US12/880,594 patent/US8839307B2/en active Active
- 2010-09-13 US US12/881,096 patent/US9258617B2/en active Active
- 2010-09-13 US US12/880,749 patent/US9110518B2/en active Active
- 2010-09-13 US US12/880,530 patent/US20110067054A1/en not_active Abandoned
- 2010-09-13 US US12/880,851 patent/US20110067051A1/en not_active Abandoned
- 2010-09-13 US US12/880,965 patent/US9271044B2/en active Active
- 2010-09-14 TW TW99131055A patent/TW201132122A/en unknown
- 2010-09-14 CN CN2010102811775A patent/CN102025933A/en active Pending
2014
- 2014-08-12 US US14/457,451 patent/US20150012939A1/en not_active Abandoned
- 2014-08-25 US US14/467,408 patent/US20140366062A1/en not_active Abandoned
- 2014-09-08 US US14/479,670 patent/US20140380381A1/en not_active Abandoned
- 2014-09-08 US US14/480,020 patent/US20140380401A1/en not_active Abandoned
- 2014-09-17 US US14/488,778 patent/US20150007222A1/en not_active Abandoned
- 2014-12-17 US US14/572,916 patent/US20150106857A1/en not_active Abandoned
2015
- 2015-01-23 US US14/603,457 patent/US20150135217A1/en not_active Abandoned
- 2015-02-19 US US14/625,810 patent/US20150172769A1/en not_active Abandoned
- 2015-06-05 US US14/731,983 patent/US20150296263A1/en not_active Abandoned
- 2015-06-29 US US14/753,183 patent/US20150304721A1/en not_active Abandoned
- 2015-07-22 US US14/805,961 patent/US20150326931A1/en not_active Abandoned
- 2015-09-11 US US14/851,225 patent/US20160007090A1/en not_active Abandoned
Patent Citations (105)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5111511A (en) * | 1988-06-24 | 1992-05-05 | Matsushita Electric Industrial Co., Ltd. | Image motion vector detecting apparatus |
US5718845A (en) * | 1990-12-12 | 1998-02-17 | Enichem S.P.A. | Tricyanovinyl substitution process for NLO polymers |
US20060037044A1 (en) * | 1993-03-29 | 2006-02-16 | Microsoft Corporation | Pausing television programming in response to selection of hypertext link |
US5408258A (en) * | 1993-04-21 | 1995-04-18 | The Arbitron Company | Method of automatically qualifying a signal reproduction device for installation of monitoring equipment |
US5793361A (en) * | 1994-06-09 | 1998-08-11 | Corporation For National Research Initiatives | Unconstrained pointing interface for natural human interaction with a display-based computer system |
US5721584A (en) * | 1994-07-22 | 1998-02-24 | Sony Corporation | Two-way broadcast system and receiving system |
US5602568A (en) * | 1994-12-22 | 1997-02-11 | Goldstar Co., Ltd. | Point type remote control apparatus and the method thereof |
US5543851A (en) * | 1995-03-13 | 1996-08-06 | Chang; Wen F. | Method and apparatus for translating closed caption data |
US5727141A (en) * | 1995-05-05 | 1998-03-10 | Apple Computer, Inc. | Method and apparatus for identifying user-selectable regions within multiple display frames |
US20030115602A1 (en) * | 1995-06-07 | 2003-06-19 | Knee Robert Alan | Electronic television program guide schedule system and method with data feed access |
US20020090114A1 (en) * | 1995-07-27 | 2002-07-11 | Rhoads Geoffrey B. | Watermark enabled video objects |
US20020056136A1 (en) * | 1995-09-29 | 2002-05-09 | Wistendahl Douglass A. | System for converting existing TV content to interactive TV programs operated with a standard remote control and TV set-top box |
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US20030005445A1 (en) * | 1995-10-02 | 2003-01-02 | Schein Steven M. | Systems and methods for linking television viewers with advertisers and broadcasters |
US7409437B2 (en) * | 1996-03-08 | 2008-08-05 | Actv, Inc. | Enhanced video programming system and method for incorporating and displaying retrieved integrated Internet information segments |
US5929849A (en) * | 1996-05-02 | 1999-07-27 | Phoenix Technologies, Ltd. | Integration of dynamic universal resource locators with television presentations |
US6097441A (en) * | 1997-12-31 | 2000-08-01 | Eremote, Inc. | System for dual-display interaction with integrated television and internet content |
US6255961B1 (en) * | 1998-05-08 | 2001-07-03 | Sony Corporation | Two-way communications between a remote control unit and one or more devices in an audio/visual environment |
US20050028208A1 (en) * | 1998-07-17 | 2005-02-03 | United Video Properties, Inc. | Interactive television program guide with remote access |
US20120079525A1 (en) * | 1998-07-17 | 2012-03-29 | United Video Properties, Inc. | Interactive television program guide with remote access |
US7536706B1 (en) * | 1998-08-24 | 2009-05-19 | Sharp Laboratories Of America, Inc. | Information enhanced audio video encoding system |
US20120163776A1 (en) * | 1998-11-02 | 2012-06-28 | United Video Properties, Inc. | Interactive program guide with continuous data stream and client-server data supplementation |
US7158676B1 (en) * | 1999-02-01 | 2007-01-02 | Emuse Media Limited | Interactive system |
US6538672B1 (en) * | 1999-02-08 | 2003-03-25 | Koninklijke Philips Electronics N.V. | Method and apparatus for displaying an electronic program guide |
US6407779B1 (en) * | 1999-03-29 | 2002-06-18 | Zilog, Inc. | Method and apparatus for an intuitive universal remote control system |
US6349410B1 (en) * | 1999-08-04 | 2002-02-19 | Intel Corporation | Integrating broadcast television pause and web browsing |
US6931660B1 (en) * | 2000-01-28 | 2005-08-16 | Opentv, Inc. | Interactive television system and method for simultaneous transmission and rendering of multiple MPEG-encoded video streams |
US20070130581A1 (en) * | 2000-02-02 | 2007-06-07 | Del Sesto Eric E | Interactive content delivery methods and apparatus |
US20080066129A1 (en) * | 2000-02-29 | 2008-03-13 | Goldpocket Interactive, Inc. | Method and Apparatus for Interaction with Hyperlinks in a Television Broadcast |
US20080016526A1 (en) * | 2000-03-09 | 2008-01-17 | Asmussen Michael L | Advanced Set Top Terminal Having A Program Pause Feature With Voice-to-Text Conversion |
US20020040482A1 (en) * | 2000-04-08 | 2002-04-04 | Sextro Gary L. | Features for interactive television |
US20060195878A1 (en) * | 2000-04-12 | 2006-08-31 | Lg Electronics Inc. | Apparatus and method for providing and obtaining product information through a broadcast signal |
US7057670B2 (en) * | 2000-04-27 | 2006-06-06 | Dan Kikinis | Cursor control system |
US20080132163A1 (en) * | 2000-05-31 | 2008-06-05 | Optinetix (Israel) Ltd. | Systems and methods for distributing information through broadcast media |
US20020042925A1 (en) * | 2000-07-24 | 2002-04-11 | Koji Ebisu | Television receiver, receiver and program execution method |
US20020078446A1 (en) * | 2000-08-30 | 2002-06-20 | Jon Dakss | Method and apparatus for hyperlinking in a television broadcast |
US20020069405A1 (en) * | 2000-09-20 | 2002-06-06 | Chapin Paul W. | System and method for spokesperson interactive television advertisements |
US20020136432A1 (en) * | 2000-11-10 | 2002-09-26 | Hiroyuki Koike | Method and apparatus for processing information of an object |
US20090199259A1 (en) * | 2001-02-02 | 2009-08-06 | Rachad Alao | Service gateway for interactive television |
US20020120934A1 (en) * | 2001-02-28 | 2002-08-29 | Marc Abrahams | Interactive television browsing and buying method |
US7360232B2 (en) * | 2001-04-25 | 2008-04-15 | Diego, Inc. | System and method to subscribe to channel URL addresses and to provide non-programming-related URL addresses in an interactive video casting system |
US20040167855A1 (en) * | 2001-04-26 | 2004-08-26 | Cambridge Vivien Johan | Automatic billing system for remote internet services |
US20030023981A1 (en) * | 2001-07-25 | 2003-01-30 | Thomas Lemmons | Method and apparatus for transmission of interactive and enhanced television data |
US20030051253A1 (en) * | 2001-08-16 | 2003-03-13 | Barone Samuel T. | Interactive television tracking system |
US20030035075A1 (en) * | 2001-08-20 | 2003-02-20 | Butler Michelle A. | Method and system for providing improved user input capability for interactive television |
US20030054878A1 (en) * | 2001-09-20 | 2003-03-20 | International Game Technology | Point of play registration on a gaming machine |
US20070097275A1 (en) * | 2001-09-27 | 2007-05-03 | Universal Electronics Inc. | Two way communication using light links |
US20030145326A1 (en) * | 2002-01-31 | 2003-07-31 | Koninklijke Philips Electronics N.V. | Subscription to TV channels/shows based on recommendation generated by a TV recommender |
US20040078814A1 (en) * | 2002-03-29 | 2004-04-22 | Digeo, Inc. | Module-based interactive television ticker |
US20090021473A1 (en) * | 2002-12-08 | 2009-01-22 | Grant Danny A | Haptic Communication Devices |
US20040119701A1 (en) * | 2002-12-19 | 2004-06-24 | Mulligan Roger C. | Lattice touch-sensing system |
US20050086690A1 (en) * | 2003-10-16 | 2005-04-21 | International Business Machines Corporation | Interactive, non-intrusive television advertising |
US20050132420A1 (en) * | 2003-12-11 | 2005-06-16 | Quadrock Communications, Inc | System and method for interaction with television content |
US20050137958A1 (en) * | 2003-12-23 | 2005-06-23 | Thomas Huber | Advertising methods for advertising time slots and embedded objects |
US20050153687A1 (en) * | 2004-01-13 | 2005-07-14 | Nokia Corporation | Providing location information |
US7535456B2 (en) * | 2004-04-30 | 2009-05-19 | Hillcrest Laboratories, Inc. | Methods and devices for removing unintentional movement in 3D pointing devices |
US20080066097A1 (en) * | 2004-10-13 | 2008-03-13 | Woodhyun Park | Method Of Realizing Interactive Advertisement Under Digital Broadcasting Environment By Extending Program Associated Data-Broadcasting To Internet Area |
US20060099964A1 (en) * | 2004-11-05 | 2006-05-11 | Ebay Inc. | System and method for location based content correlation |
US20060174273A1 (en) * | 2004-11-20 | 2006-08-03 | Samsung Electronics Co., Ltd. | Method of displaying service in DMB, and method and apparatus for managing preferred service |
US7864159B2 (en) * | 2005-01-12 | 2011-01-04 | Thinkoptics, Inc. | Handheld vision based absolute pointing system |
US20060152489A1 (en) * | 2005-01-12 | 2006-07-13 | John Sweetser | Handheld vision based absolute pointing system |
US8223136B2 (en) * | 2005-06-07 | 2012-07-17 | Intel Corporation | Error detection and prevention in acoustic data |
US7344084B2 (en) * | 2005-09-19 | 2008-03-18 | Sony Corporation | Portable video programs |
US20070137611A1 (en) * | 2005-12-21 | 2007-06-21 | Yu Robert C | Active radical initiator for internal combustion engines |
US20110179435A1 (en) * | 2005-12-29 | 2011-07-21 | Charles Cordray | Systems and methods for managing content |
US20070157260A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Interactive media guidance system having multiple devices |
US20070156521A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Systems and methods for commerce in media program related merchandise |
US20070195205A1 (en) * | 2006-02-21 | 2007-08-23 | Lowe Jerry B | Remote control system and method |
US20070199014A1 (en) * | 2006-02-22 | 2007-08-23 | E-Cast, Inc. | Consumer portal |
US20100064320A1 (en) * | 2006-03-13 | 2010-03-11 | Verizon Services Corp. | Integrating data on program popularity into an on-screen program guide |
US8095423B2 (en) * | 2006-03-17 | 2012-01-10 | Grant Allen Lee Nichols | Interactive international bulk trade television |
US20080052750A1 (en) * | 2006-08-28 | 2008-02-28 | Anders Grunnet-Jepsen | Direct-point on-demand information exchanges |
US8421746B2 (en) * | 2006-09-07 | 2013-04-16 | Porto Vinci Ltd. Limited Liability Company | Device control using multi-dimensional motion sensing and a wireless home entertainment hub |
US20080109851A1 (en) * | 2006-10-23 | 2008-05-08 | Ashley Heather | Method and system for providing interactive video |
US20080136754A1 (en) * | 2006-12-06 | 2008-06-12 | Sony Corporation | Display apparatus, display-apparatus control method and program |
US20080209480A1 (en) * | 2006-12-20 | 2008-08-28 | Eide Kurt S | Method for enhanced video programming system for integrating internet data for on-demand interactive retrieval |
US20080172693A1 (en) * | 2007-01-16 | 2008-07-17 | Microsoft Corporation | Representing Television Programs Using Video Objects |
US20080172252A1 (en) * | 2007-01-17 | 2008-07-17 | Mitochon Systems, Inc | Apparatus and Method for Revenue Distribution Generated From Delivering Healthcare Advertisements Via EMR Systems, RHIN, and Electronic Advertising Servers |
US20080177570A1 (en) * | 2007-01-18 | 2008-07-24 | Ari Craine | Methods, Systems, and Computer-Readable Media for Disease Management |
US20080204603A1 (en) * | 2007-02-27 | 2008-08-28 | Hideharu Hattori | Video displaying apparatus and video displaying method |
US20080204605A1 (en) * | 2007-02-28 | 2008-08-28 | Leonard Tsai | Systems and methods for using a remote control unit to sense television characteristics |
US7890380B2 (en) * | 2007-05-07 | 2011-02-15 | At&T Intellectual Property I, L.P. | Method, system, and computer readable medium for implementing sales of products using a trace of an object |
US20120154268A1 (en) * | 2007-05-14 | 2012-06-21 | Apple Inc. | Remote control systems that can distinguish stray light sources |
US7889175B2 (en) * | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
US20090037947A1 (en) * | 2007-07-30 | 2009-02-05 | Yahoo! Inc. | Textual and visual interactive advertisements in videos |
US20090113475A1 (en) * | 2007-08-21 | 2009-04-30 | Yi Li | Systems and methods for integrating search capability in interactive video |
US7987478B2 (en) * | 2007-08-28 | 2011-07-26 | Sony Ericsson Mobile Communications Ab | Methods, devices, and computer program products for providing unobtrusive video advertising content |
US20090077394A1 (en) * | 2007-09-17 | 2009-03-19 | Jr-Shian Tsai | Techniques for communications based power management |
US20090083815A1 (en) * | 2007-09-19 | 2009-03-26 | Mcmaster Orlando | Generating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time |
US8359628B2 (en) * | 2007-10-05 | 2013-01-22 | Sony Corporation | Display device and transmitting device |
US20090165048A1 (en) * | 2007-12-19 | 2009-06-25 | United Video Properties, Inc. | Methods and devices for presenting guide listings and guidance data in three dimensions in an interactive media guidance application |
US20090165041A1 (en) * | 2007-12-21 | 2009-06-25 | Penberthy John S | System and Method for Providing Interactive Content with Video Content |
US20090187862A1 (en) * | 2008-01-22 | 2009-07-23 | Sony Corporation | Method and apparatus for the intuitive browsing of content |
US20090217317A1 (en) * | 2008-02-26 | 2009-08-27 | At&T Knowledge Ventures, L.P. | System and method for promoting marketable items |
US20100098074A1 (en) * | 2008-10-22 | 2010-04-22 | Backchannelmedia Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US8181212B2 (en) * | 2008-10-30 | 2012-05-15 | Frederic Sigal | Method of providing a frame-based object redirection overlay for a video stream |
US20100157152A1 (en) * | 2008-12-18 | 2010-06-24 | Thomson Licensing | Display device with feedback elements and method for monitoring |
US20100162303A1 (en) * | 2008-12-23 | 2010-06-24 | Cassanova Jeffrey P | System and method for selecting an object in a video data stream |
US20100218228A1 (en) * | 2009-02-20 | 2010-08-26 | Walter Edward A | System and method for processing image objects in video data |
US20110067063A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20110063523A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television controller for providing user-selection of objects in a television program |
US20110067062A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for providing information of selectable objects in a television program |
US20110067064A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20110067069A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a parallel television system for providing for user-selection of an object in a television program |
US20110141013A1 (en) * | 2009-12-14 | 2011-06-16 | Alcatel-Lucent Usa, Incorporated | User-interface apparatus and method for user control |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8819732B2 (en) | 2009-09-14 | 2014-08-26 | Broadcom Corporation | System and method in a television system for providing information associated with a user-selected person in a television program |
US20110063511A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television controller for providing user-selection of objects in a television program |
US9462345B2 (en) | 2009-09-14 | 2016-10-04 | Broadcom Corporation | System and method in a television system for providing for user-selection of an object in a television program |
US20110067065A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for providing information associated with a user-selected information element in a television program |
US20110063523A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television controller for providing user-selection of objects in a television program |
US20110067057A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for responding to user-selection of an object in a television program utilizing an alternative communication network |
US20110067055A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for providing information associated with a user-selected person in a television program |
US8832747B2 (en) | 2009-09-14 | 2014-09-09 | Broadcom Corporation | System and method in a television system for responding to user-selection of an object in a television program based on user location |
US20110067062A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for providing information of selectable objects in a television program |
US20110067052A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for providing information of selectable objects in a television program in an information stream independent of the television program |
US20110067069A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a parallel television system for providing for user-selection of an object in a television program |
US20110067063A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20110067064A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20110063206A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for generating screen pointing information in a television control device |
US20110067051A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for providing advertising information associated with a user-selected object in a television program |
US20110067071A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for responding to user-selection of an object in a television program based on user location |
US20110067061A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television system for providing for user-selection of an object in a television program |
US20110067047A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a distributed system for providing user-selection of objects in a television program |
US20110067054A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a distributed system for responding to user-selection of an object in a television program |
US20110067056A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a local television system for responding to user-selection of an object in a television program |
US20110063509A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method in a television receiver for providing user-selection of objects in a television program |
US20110063522A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for generating television screen pointing information using an external receiver |
US20110063521A1 (en) * | 2009-09-14 | 2011-03-17 | Jeyhan Karaoguz | System and method for generating screen pointing information in a television |
US8931015B2 (en) | 2009-09-14 | 2015-01-06 | Broadcom Corporation | System and method for providing information of selectable objects in a television program in an information stream independent of the television program |
US8947350B2 (en) | 2009-09-14 | 2015-02-03 | Broadcom Corporation | System and method for generating screen pointing information in a television control device |
US8990854B2 (en) | 2009-09-14 | 2015-03-24 | Broadcom Corporation | System and method in a television for providing user-selection of objects in a television program |
US9043833B2 (en) | 2009-09-14 | 2015-05-26 | Broadcom Corporation | System and method in a television system for presenting information associated with a user-selected object in a television program |
US9081422B2 (en) | 2009-09-14 | 2015-07-14 | Broadcom Corporation | System and method in a television controller for providing user-selection of objects in a television program |
US9098128B2 (en) | 2009-09-14 | 2015-08-04 | Broadcom Corporation | System and method in a television receiver for providing user-selection of objects in a television program |
US9110517B2 (en) | 2009-09-14 | 2015-08-18 | Broadcom Corporation | System and method for generating screen pointing information in a television |
US9110518B2 (en) | 2009-09-14 | 2015-08-18 | Broadcom Corporation | System and method in a television system for responding to user-selection of an object in a television program utilizing an alternative communication network |
US9137577B2 (en) | 2009-09-14 | 2015-09-15 | Broadcom Corporation | System and method of a television for providing information associated with a user-selected information element in a television program |
US9197941B2 (en) | 2009-09-14 | 2015-11-24 | Broadcom Corporation | System and method in a television controller for providing user-selection of objects in a television program |
US9271044B2 (en) * | 2009-09-14 | 2016-02-23 | Broadcom Corporation | System and method for providing information of selectable objects in a television program |
US9258617B2 (en) | 2009-09-14 | 2016-02-09 | Broadcom Corporation | System and method in a television system for presenting information associated with a user-selected object in a television program |
US20160029094A1 (en) * | 2013-04-22 | 2016-01-28 | LiveRelay Inc. | Enabling interaction between social network users during synchronous display of video channel |
US20140317660A1 (en) * | 2013-04-22 | 2014-10-23 | LiveRelay Inc. | Enabling interaction between social network users during synchronous display of video channel |
US10104156B2 (en) * | 2014-06-10 | 2018-10-16 | Fuji Xerox Co., Ltd. | Object image information management server, recording medium, and object image information management method |
US20160070892A1 (en) * | 2014-08-07 | 2016-03-10 | Click Evidence, Inc. | System and method for creating, processing, and distributing images that serve as portals enabling communication with persons who have interacted with the images |
US9928352B2 (en) * | 2014-08-07 | 2018-03-27 | Tautachrome, Inc. | System and method for creating, processing, and distributing images that serve as portals enabling communication with persons who have interacted with the images |
US10339283B2 (en) * | 2014-08-07 | 2019-07-02 | Tautachrome, Inc. | System and method for creating, processing, and distributing images that serve as portals enabling communication with persons who have interacted with the images |
US10649666B1 (en) * | 2017-05-10 | 2020-05-12 | Ambarella International Lp | Link-list shortening logic |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8931015B2 (en) | System and method for providing information of selectable objects in a television program in an information stream independent of the television program | |
US9129641B2 (en) | Method and system for media selection and sharing | |
US20110162007A1 (en) | Television system providing user-interaction related to an in-progress television program | |
US20130259447A1 (en) | Method and apparatus for user directed video editing | |
US20100289900A1 (en) | Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences | |
US20100158391A1 (en) | Identification and transfer of a media object segment from one communications network to another | |
US20130259446A1 (en) | Method and apparatus for user directed video editing | |
CN108605153A (en) | Synchronized multimedia content tab data | |
US20110178880A1 (en) | System and method for monitoring and reporting presentation of recorded advertising content | |
KR101533836B1 (en) | Purchasing advertisement object method based on creating time-table using purview cursor for logotional advertisement | |
KR101664000B1 (en) | Purchasing advertisement object system based on creating time-table using purview cursor for logotional advertisement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KARAOGUZ, JEYHAN;SESHADRI, NAMBIRAJAN;SIGNING DATES FROM 20100910 TO 20100913;REEL/FRAME:025094/0017 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |