US20150052540A1 - Method and System for Providing Virtual Co-Presence to Broadcast Audiences in an Online Broadcasting System


Info

Publication number
US20150052540A1
Authority
US
United States
Prior art keywords
user reaction
user
data
icon
reaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/531,053
Inventor
Jin Soo Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Excalibur IP LLC
Altaba Inc
Original Assignee
Yahoo! Inc. (until 2017)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yahoo! Inc.
Priority to US14/531,053
Publication of US20150052540A1
Assigned to EXCALIBUR IP, LLC (assignment of assignors interest; assignor: YAHOO! INC.)
Assigned to YAHOO! INC. (assignment of assignors interest; assignor: EXCALIBUR IP, LLC)
Assigned to EXCALIBUR IP, LLC (assignment of assignors interest; assignor: YAHOO! INC.)
Status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/478 - Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 - Supplemental services, e.g. displaying phone caller identification, shopping application, communicating with other users, e.g. chatting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00 - Digital computers in general; Data processing equipment in general
    • G06F15/08 - Digital computers in general; Data processing equipment in general using a plugboard for programming
    • G06F15/10 - Tabulators
    • G06F15/12 - Tabulators having provision for both printed and punched output
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 - Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/251 - Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N21/252 - Processing of multiple end-users' preferences to derive collaborative data
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439 - Processing of audio elementary streams
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 - Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 - Monitoring of end-user related data
    • H04N21/44218 - Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/475 - End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4756 - End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data for rating content, e.g. scoring a recommended movie
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/475 - End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4758 - End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data for providing answers, e.g. voting
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 - Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/65 - Transmission of management data between client and server
    • H04N21/658 - Transmission by the client directed to the server
    • H04N21/6582 - Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/16 - Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 - Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 - Transmission or handling of upstream communications
    • H04N7/17318 - Direct or substantially direct transmission and handling of requests

Abstract

There is provided an online broadcasting system for providing virtual co-presence to broadcast audiences. The online broadcasting system may comprise a plurality of user terminals, a user reaction processing server and a broadcast server configured to transmit broadcast program data to the user terminals. The user terminals may be configured to: display a broadcast program based on the broadcast program data; receive individual user reaction data from a user input interface; and transmit the received individual user reaction data to the broadcast server. The user reaction processing server may be configured to: receive the individual user reaction data from each of the user terminals; aggregate the individual user reaction data to create total user reaction data based on the received individual user reaction data; generate interface control data based on the total user reaction data, wherein the interface control data, when executed, is operative to cause a client application to generate output indicative of the total user reaction data; and transmit the generated interface control data to at least one of the user terminals.

Description

    TECHNICAL FIELD
  • The present invention generally relates to providing virtual co-presence to audiences of a program broadcast in a communication network.
  • BACKGROUND
  • As the Internet has become highly integrated into everyday life, Internet websites have emerged as an attractive new medium for providing various entertainment content, such as online games. In particular, as the number of people using such content has grown, various methods have been devised to provide so-called “virtual co-presence” to users of the same content. “Virtual co-presence” refers to the psychological state of feeling that one is at the same place as, and interacting with, other people who are in different geographical locations but are using the same content at the same time. In the area of online games, visual means such as avatars have been used to provide such a sense of “co-presence” to the users participating in a game.
  • Meanwhile, the Internet is nowadays used as a new medium for broadcasting various types of entertainment programs (e.g., sports or reality show programs) to Internet users. Many users log on to websites providing such entertainment programs and view a program to their liking. In particular, when popular sports matches (e.g., the U.S. Super Bowl or the FIFA World Cup) are broadcast live over the Internet, thousands of users may watch the same game through the same website broadcasting the match. These viewers all share the excitement of the live broadcast simultaneously and may express their impressions of the game through bulletin postings or other means after the game has ended. However, there are no conventional means for them to share their viewing experience in real-time with other viewers watching the game through the same website. Thus, there is a need for a method and apparatus for providing “virtual co-presence” to the viewers watching a broadcast program through a communication network by relaying their reactions to the broadcast program in real-time.
  • SUMMARY
  • There is provided a method and apparatus for relaying and sharing real-time reactions of viewers watching a program broadcast through a website via an online broadcasting system for providing virtual co-presence to broadcast audiences. The online broadcasting system may comprise a plurality of user terminals, a user reaction processing server and a broadcast server configured to transmit broadcast program data to the user terminals. The user terminals may be configured to: display a broadcast program based on the broadcast program data; receive individual user reaction data from a user input interface; and transmit the received individual user reaction data to the broadcast server or the user reaction processing server. The broadcast server or the user reaction processing server may be configured to: receive the individual user reaction data from each of the user terminals; aggregate the individual user reaction data to create total user reaction data based on the received individual user reaction data; generate interface control data based on the total user reaction data, wherein the interface control data, when executed, is operative to cause a client application to generate output indicative of the total user reaction data; and transmit the generated interface control data to at least one of the user terminals. The individual user reaction data may include a plurality of individual user reaction fields. The broadcast program may be displayed with a plurality of user reaction icons, wherein each of the user reaction icons corresponds to one of the individual user reaction fields. The user terminals may receive the individual user reaction data indicating at least one of the user reaction icons from viewers watching the broadcast program. The user terminals may receive the generated interface control data and generate at least one of visual, auditory and tactual outputs based on the received interface control data. The total user reaction data may include a plurality of total user reaction fields, wherein each of the total user reaction fields corresponds to one of the individual user reaction fields. The total user reaction data may be generated by generating each of the total user reaction fields based on the corresponding individual user reaction fields of the received individual user reaction data. The total user reaction fields may be generated by calculating a total user reaction number for each of the user reaction fields.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a schematic diagram of an online broadcasting system according to one embodiment of the present invention.
  • FIG. 2 illustrates a schematic diagram of an online broadcasting system according to another embodiment of the present invention.
  • FIG. 3 illustrates a flow chart of a method for processing user reaction data in a user terminal according to one embodiment of the present invention.
  • FIG. 4 illustrates a flow chart of a method for generating total reaction data based on individual reaction data received from the user terminals in the broadcast server according to one embodiment of the present invention.
  • FIG. 5 illustrates an example computing system architecture, which may be used to implement embodiments of the present invention.
  • FIG. 6 illustrates example web pages showing broadcast program and audience reactions thereto.
  • DESCRIPTION OF EXAMPLE EMBODIMENT(S)
  • Various embodiments of the present invention will be described below in detail with reference to the accompanying drawings. It will be apparent, however, that these embodiments may be practiced without some or all of the specific details set forth herein. In other instances, well-known process steps or elements have not been described in detail so as not to unnecessarily obscure the description of the present invention.
  • FIG. 1 illustrates a schematic diagram of an online broadcasting system according to one embodiment of the present invention. Referring to FIG. 1, the online broadcasting system 100 includes a communication network 110 such as the Internet, a plurality of user terminals (UT) 121, 122 and 123 connected to the communication network 110 and a broadcast server 130 broadcasting a program to the user terminals 121, 122 and 123 through the communication network 110. The user terminals 121, 122 and 123 may be personal computers, notebooks or any other terminals with appropriate communication means to provide their users with access to online broadcast services, such as the online sports broadcast service provided by the broadcast server 130. Although only three terminals 121, 122 and 123 are illustrated in FIG. 1 for ease of explanation, it should be noted that fewer or more than three terminals may be connected to the communication network 110 to access the broadcast server 130.
  • For example, first, second and third users may respectively log on to the broadcast server 130 through the first, second and third user terminals 121, 122 and 123 to watch the program broadcast from the broadcast server 130 through the communication network 110. After the users log on to the broadcast server 130, each of the user terminals 121, 122 and 123 may receive from the broadcast server 130 the user interface data for displaying a broadcast program display screen. The broadcast program display screen may be prepared by a client application running on each of the user terminals 121, 122 and 123. An example of the broadcast program display screen is shown in FIG. 6. Referring to FIG. 6, the broadcast program display screen, such as broadcast program display screen 600, may include a broadcast program display portion 610, which may have a rectangular form, for displaying the program broadcast from the server 130. Further, the broadcast program display screen 600 may include an audience reaction portion 620 for providing the reaction of the users watching the program displayed on the broadcast program display portion 610.
  • The audience reaction portion 620 may include a user reaction input section 621 for generating individual user reaction data, which indicates the reaction of the corresponding user to the program currently being displayed on the broadcast program display portion. In one embodiment of the present invention, the user reaction input section 621 may include a plurality of user reaction icons, each indicating a predetermined reaction the users may express in response to the broadcast program. For example, the user reaction icons may include a hooting icon, a cheering icon, an applauding icon and any other icons that indicate a specific mood or action the users may have in response to the broadcast program. According to one embodiment, when one of the users selects one of the user reaction icons by clicking or placing a cursor on one of the user reaction icons displayed on the screen of the corresponding user terminal 121, 122 or 123 through a user input interface (e.g., mouse, keyboard, etc.), the corresponding user terminal 121, 122 or 123 generates individual user reaction data based on such user input. The individual user reaction data may include a plurality of individual user reaction fields, wherein each field corresponds to one of the user reaction icons and includes a flag indicating whether the corresponding user reaction icon has been selected by the user. The flag may be set to “1” when the corresponding user reaction icon has been selected by the user. On the other hand, the flag may be set to “0” when the corresponding user reaction icon has not been selected by the user. The individual user reaction data may be collected periodically over a time interval of a predetermined length and put together as individual user reaction data representative of that time interval. The corresponding user terminal 121, 122 or 123 may then transmit the individual user reaction data generated for that time interval to the broadcast server 130.
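  • A minimal Python sketch of one way the individual user reaction data described above might be represented is given below. The class, method and icon names, and the tuple ordering, are illustrative assumptions rather than a format prescribed by the disclosure.

```python
from dataclasses import dataclass, field

# Illustrative reaction icons; the text names hooting, cheering and applauding as examples.
REACTION_ICONS = ("hooting", "cheering", "applauding")

@dataclass
class IndividualUserReaction:
    """One viewer's reaction flags collected over a single predetermined interval."""
    # One flag per user reaction icon: 1 if the icon was selected during the
    # interval, 0 otherwise.
    flags: dict = field(default_factory=lambda: {icon: 0 for icon in REACTION_ICONS})

    def select(self, icon: str) -> None:
        """Record that the viewer clicked the given reaction icon."""
        if icon not in self.flags:
            raise ValueError(f"unknown reaction icon: {icon}")
        self.flags[icon] = 1

    def as_tuple(self) -> tuple:
        """Ordered-tuple form, e.g. (1, 0, 0) when only 'hooting' was selected."""
        return tuple(self.flags[icon] for icon in REACTION_ICONS)

# Example: a viewer clicks the hooting icon during the current interval;
# the resulting tuple is what the terminal would transmit for that interval.
reaction = IndividualUserReaction()
reaction.select("hooting")
assert reaction.as_tuple() == (1, 0, 0)
```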
  • The broadcast server 130 may receive the individual user reaction data from the user terminals 121, 122 and 123 and aggregate the received individual data to create the total user reaction data based on the received individual user reaction data. The total user reaction data reflects the reactions of all of the users in response to the program broadcast by the broadcast server 130 during a predetermined time interval. In one embodiment, the total user reaction data may contain a plurality of total user reaction fields, each corresponding to one of the user reaction fields of the individual user reaction data. The broadcast server 130 may generate the total user reaction data by calculating the total user reaction number for each of the user reaction fields. For each of the user reaction fields of the total user reaction data, the user reaction number may be obtained by counting the individual user reaction data in which the flag of the corresponding user reaction field is set to “1.” For example, if the individual user reaction data contains the “hooting,” “cheering” and “applauding” fields (e.g., as an ordered tuple), and the broadcast server 130 receives from the user terminals 121, 122 and 123 the individual user reaction data (1,0,0), (1,0,0) and (0,1,0), respectively, then the broadcast server 130 may add these to calculate the total user reaction data as (2,1,0). The broadcast server 130 may transmit the total user reaction data generated for the predetermined time interval to the user terminals 121, 122 and 123. Further, the broadcast server 130 may generate and transmit interface control data based on the total user reaction data to the user terminals 121, 122 and 123 through the communication network 110. The interface control data, when executed, is operative to cause the client application running on each of the user terminals 121, 122 and 123 to generate an output indicative of the total user reaction data. The format and structure of the interface control data may depend on the type of the client application and/or the operating system running on the user terminals 121, 122 and 123.
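  • The aggregation step can be pictured with a small sketch that assumes the same ordered-tuple layout; the helper names and the dictionary form of the interface control data are illustrative, since the disclosure leaves the exact format open.

```python
from typing import Iterable, Tuple

REACTION_ICONS = ("hooting", "cheering", "applauding")  # assumed field ordering

def aggregate_reactions(individual: Iterable[Tuple[int, ...]]) -> Tuple[int, ...]:
    """Sum the per-field flags received from all user terminals for one interval."""
    totals = [0] * len(REACTION_ICONS)
    for flags in individual:
        for i, flag in enumerate(flags):
            totals[i] += flag
    return tuple(totals)

def build_interface_control_data(totals: Tuple[int, ...]) -> dict:
    """Wrap the totals in a client-neutral structure (a hypothetical format)."""
    return {"type": "total_user_reaction", "fields": dict(zip(REACTION_ICONS, totals))}

# The worked example from the text: (1,0,0) + (1,0,0) + (0,1,0) = (2,1,0).
totals = aggregate_reactions([(1, 0, 0), (1, 0, 0), (0, 1, 0)])
assert totals == (2, 1, 0)
control = build_interface_control_data(totals)
# control == {'type': 'total_user_reaction',
#             'fields': {'hooting': 2, 'cheering': 1, 'applauding': 0}}
```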
  • Each of the user terminals 121, 122 and 123 receives the interface control data and generates visual, auditory and/or tactual output based on the received interface control data. The user terminals 121, 122 and 123 may have monitors, speakers and vibrators for the visual, auditory and tactual outputs, respectively. Other means may be employed for the above sensory output. Those skilled in the art would have no difficulty in selecting the appropriate means for such purpose.
  • In one embodiment, the user terminals 121, 122 and 123 may output sounds (e.g., hooting, cheering and applauding sounds) corresponding to the received interface control data through the speakers connected thereto. For example, if the interface control data corresponding to the total user reaction data (2,1,0) is received, then a hooting sound and a cheering sound may be output through the speaker. The intensity of each sound pertaining to one of the user reaction fields may be controlled in accordance with the number indicated by the corresponding field. In the above example, the hooting sound may be output with twice the intensity of the cheering sound. This configuration accurately reflects the collective reaction of all the users watching the broadcast program.
  • Further, the user terminals 121, 122 and 123 may control the vibration of the vibrators connected thereto in accordance with the received interface control data. For example, the intensity and frequency of the vibration may be controlled according to the interface control data indicative of the total user reaction. Such vibrators may be used to convey the virtual co-presence of other users watching the game, especially to a person with visual or hearing impairments. Also, the vibrators may be used in conjunction with other devices to provide a richer experience to the users.
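  • The two preceding paragraphs leave the exact mapping from reaction counts to sound intensity and vibration parameters unspecified. The sketch below shows one plausible mapping, in which each sound is scaled relative to the largest count and a single vibration pattern is derived from the overall reaction level; the scaling constants and the 0..1 ranges are assumptions.

```python
REACTION_ICONS = ("hooting", "cheering", "applauding")

def sound_volumes(totals, max_volume=1.0):
    """Scale each reaction sound in proportion to its total count.

    With totals (2, 1, 0), the hooting sound plays at twice the volume of the
    cheering sound and the applauding sound stays silent.
    """
    peak = max(totals) or 1  # avoid dividing by zero when nobody reacted
    return {icon: max_volume * count / peak
            for icon, count in zip(REACTION_ICONS, totals)}

def vibration_pattern(totals, base_frequency_hz=2.0, saturation=10):
    """Derive a vibration intensity and frequency from the overall reaction level."""
    overall = sum(totals)
    intensity = min(1.0, overall / saturation)
    frequency = base_frequency_hz * (1 + overall / saturation)
    return {"intensity": intensity, "frequency_hz": frequency}

print(sound_volumes((2, 1, 0)))      # {'hooting': 1.0, 'cheering': 0.5, 'applauding': 0.0}
print(vibration_pattern((2, 1, 0)))  # {'intensity': 0.3, 'frequency_hz': 2.6}
```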
  • In addition, the user terminals 121, 122 and 123 may provide visual output through the monitors in accordance with the received interface control data. For example, referring to FIG. 6, the audience reaction portion 620 of the broadcast program display screen 600 may include a user reaction output section 622 for providing the visual output. In one embodiment, the reaction output section 622 may include one avatar, which displays certain movements in accordance with the received interface control data. The avatar may display a hooting, cheering or applauding movement. For example, if the interface control data corresponding to the total user reaction data (2,1,0) is received, then the avatar may display the hooting movement, since the hooting field has the largest number. In another embodiment, the reaction output section 622 may include a plurality of avatars, wherein each of the avatars displays one of the predetermined movements in accordance with the received interface control data. For example, each of the avatars may display one of the movements corresponding to the user reaction fields whose flag is set to “1” in the received total user reaction data. In the above example, when the interface control data corresponding to the total reaction data (2,1,0) is received, two of the avatars may display the “hooting” and “cheering” movements. Various methods other than avatars may be used to visually convey the reaction of all the users watching the program broadcast by the broadcast server 130.
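  • The avatar behaviour described above can be sketched as two small selection rules, again assuming the ordered-tuple form of the total user reaction data; the function names are illustrative.

```python
REACTION_ICONS = ("hooting", "cheering", "applauding")

def single_avatar_movement(totals):
    """Single-avatar embodiment: perform the movement of the field with the largest count."""
    best = max(range(len(totals)), key=lambda i: totals[i])
    return REACTION_ICONS[best] if totals[best] > 0 else None

def multi_avatar_movements(totals):
    """Multi-avatar embodiment: one movement per field with a non-zero count."""
    return [icon for icon, count in zip(REACTION_ICONS, totals) if count > 0]

# For the worked example (2, 1, 0): one avatar hoots, or two avatars hoot and cheer.
assert single_avatar_movement((2, 1, 0)) == "hooting"
assert multi_avatar_movements((2, 1, 0)) == ["hooting", "cheering"]
```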
  • All of the above data, and the signaling messages required to transmit them, may be carried using known communication network protocols such as HTTP (Hyper Text Transfer Protocol). Those skilled in the art would have no difficulty in selecting and implementing an appropriate protocol to realize the aforementioned technical configuration.
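  • As one concrete illustration of carrying the reaction data over HTTP, a terminal could POST its per-interval flags as a JSON body. The endpoint path, payload layout and server URL below are assumptions, since the disclosure only requires the use of a known protocol such as HTTP.

```python
import json
import urllib.request

def post_individual_reaction(server_url: str, user_id: str, flags: dict) -> int:
    """Send one interval's individual user reaction data to the server over HTTP."""
    payload = json.dumps({"user": user_id, "flags": flags}).encode("utf-8")
    request = urllib.request.Request(
        server_url + "/reactions",          # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status              # e.g. 200 on success

# Example call (requires a reachable server):
# post_individual_reaction("http://reactions.example.com", "viewer-42",
#                          {"hooting": 1, "cheering": 0, "applauding": 0})
```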
  • Each of the broadcast server 130 and user terminals 121, 122 and 123 may include a communication module and a control module to perform the aforementioned functions. The user terminals 121, 122 and 123 may further include a user interface to receive the user inputs from the users. In particular, the communication module may be configured to communicate with other network entities connected to the communication network 110. The broadcast program data, individual user reaction data, total user reaction data and/or interface control data may be communicated to other network entities through the communication network 110 by the communication module. The control module of the broadcast server 130 may be configured to generate the total user reaction data based on the individual user reaction data received from the user terminals 121, 122 and 123. Further, the control module of the broadcast server 130 may be configured to generate the interface control data based on the total user reaction data. The control module of the user terminals 121, 122 and 123 may be configured to: receive the broadcast program data from the broadcast server through the communication module; display a broadcast program based on the broadcast program data; in response to the displayed broadcast program, receive individual user reaction data from the user interface; transmit the received individual user reaction data to a user reaction processing server or the broadcast server through the communication module; and in response to the transmitted individual user reaction data, receive interface control data from the user reaction processing server or the broadcast server.
  • FIG. 2 illustrates a schematic diagram of an online broadcasting system according to another embodiment of the present invention. The online broadcasting system 200 in FIG. 2 has a similar configuration to the online broadcasting system 100 shown in FIG. 1. Thus, each component of the online broadcasting system 200, which has substantially the same function as the counterpart shown in FIG. 1, is identified by the same reference numeral and the description thereof will be omitted herein. Further, the features different from the online broadcasting system 100 of FIG. 1 will be explained in detail in the ensuing descriptions.
  • Referring to FIG. 2, the online broadcasting system 200 further includes a user reaction processing server 250. For example, the user terminals 121, 122 and 123 may receive a program broadcast from a broadcast server 130 and transmit individual user reaction data to the user reaction processing server 250. The user reaction processing server 250 may receive the individual user reaction data from each of the user terminals 121, 122 and 123 and generate total user reaction data based on the received individual user reaction data. In generating the total user reaction data, the user reaction processing server 250 may employ various methods including those described with respect to FIG. 1. Further, the user reaction processing server 250 may generate interface control data based on the generated total user reaction data. Thereafter, the user reaction processing server 250 transmits the generated interface control data to the user terminals 121, 122 and 123. The online broadcasting system according to this embodiment relieves the traffic load of the broadcast server 130 by providing the total user reaction data and the interface control data through a separate server, i.e., the user reaction processing server 250.
  • FIG. 3 illustrates a flow chart of a method for processing user reaction data in a user terminal according to one embodiment of the present invention. Referring to FIG. 3, a user terminal such as one of the user terminals 121, 122 and 123 (FIGS. 1 and 2) receives broadcast program data from a broadcast server such as the broadcast server 130 (operation 300). For example, the broadcast program data may be sound, video or any other type of multimedia data. The broadcast program data may be relayed using RTP (Real-time Transport Protocol)/RTCP (RTP Control Protocol) or any other appropriate communication network protocols. In operation 310, the user terminal displays the program contained in the broadcast program data. In operation 320, the user terminal receives, from a user interface such as a keyboard or a mouse, individual user reaction data of a user currently watching the program through the user terminal, and transmits the received individual user reaction data to the broadcast server or to a user reaction processing server such as the user reaction processing server 250 (FIG. 2) (operation 330). Thereafter, the user terminal receives the interface control data from the broadcast server or the user reaction processing server (operation 340) and produces visual, auditory and/or tactual output based on the received interface control data (operation 350).
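  • Operations 300-350 can be summarised as a simple terminal-side loop. The sketch below assumes hypothetical helper objects for the transport, the input interface and the output devices, since the disclosure does not prescribe any particular API.

```python
def run_user_terminal(broadcast, reaction_server, ui, output):
    """Sketch of the terminal-side flow of FIG. 3; the helper objects are assumed, not a real API."""
    while True:
        program_data = broadcast.receive_program_data()        # operation 300 (e.g., over RTP/RTCP)
        ui.display(program_data)                               # operation 310
        reaction = ui.collect_reaction_for_interval()          # operation 320
        reaction_server.send_individual_reaction(reaction)     # operation 330
        control = reaction_server.receive_interface_control()  # operation 340
        output.render(control)                                 # operation 350: visual/auditory/tactual
```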
  • FIG. 4 illustrates a flow chart of a method for generating total user reaction data based on the individual user reaction data received from the user terminals in the broadcast server according to one embodiment of the present invention. Referring to FIG. 4, a broadcast server such as the broadcast server 130 (FIG. 1) broadcasts a program by transmitting broadcast program data to a plurality of user terminals such as the user terminals 121, 122 and 123 (FIG. 1) (operation 400). In operation 410, the broadcast server receives the individual user reaction data from the user terminals and, in operation 420, aggregates the received individual user reaction data to create the total user reaction data. In operation 430, the broadcast server generates interface control data based on the generated total user reaction data. The generation of the total user reaction data and the interface control data is described above with respect to FIGS. 1 and 2 and is therefore not repeated here. In operation 440, the broadcast server transmits the generated interface control data to the plurality of user terminals.
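  • The aggregation of operation 420 and the control-data generation of operation 430 might look like the following Python sketch. Counting, for each reaction field, the reports whose flag is set to '1' follows claims 34 and 35 below, while mapping the totals to an effect name and a level is an illustrative assumption.

```python
def create_total_user_reaction(individual_reports,
                               icons=("hooting", "cheering", "applauding")):
    """Operation 420: for each reaction field, count the individual reports
    whose flag for that field is set to 1 (cf. claims 34 and 35)."""
    return {icon: sum(report.get(icon, 0) for report in individual_reports)
            for icon in icons}

def generate_interface_control_data(total_user_reaction, audience_size):
    """Operation 430: turn the totals into output instructions for the client
    application.  Reducing them to a dominant effect and a level is only an
    illustrative assumption."""
    dominant = max(total_user_reaction, key=total_user_reaction.get)
    return {
        "effect": dominant,
        "level": round(total_user_reaction[dominant] / max(audience_size, 1), 2),
    }

# Three terminals report during one interval; two of them cheered.
reports = [{"cheering": 1}, {"cheering": 1, "applauding": 1}, {"hooting": 1}]
total = create_total_user_reaction(reports)
print(total)                                            # {'hooting': 1, 'cheering': 2, 'applauding': 1}
print(generate_interface_control_data(total, audience_size=3))  # {'effect': 'cheering', 'level': 0.67}
```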
  • While the methods and systems of the present invention have been described above with reference to specific embodiments, some or all of the elements or operations thereof may be implemented using a computer system having a general purpose hardware architecture. FIG. 5 illustrates an example computing system architecture that may be used to implement the above-described embodiments and to perform one or more of the processes or elements described herein. In one implementation, the hardware system 500 includes a processor 502, a cache memory 504 and one or more software applications and drivers directed to the functions described herein.
  • Additionally, the hardware system 500 includes a high performance input/output (I/O) bus 506 and a standard I/O bus 508. A host bridge 510 couples the processor 502 to the high performance I/O bus 506, whereas an I/O bus bridge 512 couples the two buses 506 and 508 to each other. A system memory 514 and a network/communication interface 516 are coupled to the bus 506. The hardware system 500 may further include a video memory (not shown) and a display device coupled to the video memory. A mass storage 518 and I/O ports 520 are coupled to the bus 508. The hardware system 500 may optionally include a keyboard, a pointing device and a display device (not shown) coupled to the bus 508. Collectively, these elements are intended to represent a broad category of computer hardware systems, including, but not limited to, general purpose computer systems based on the Pentium® processor manufactured by Intel Corporation of Santa Clara, Calif., as well as any other suitable processor.
  • The elements of the hardware system 500 are described in greater detail below. In particular, the network interface 516 provides communication between the hardware system 500 and any one of a wide range of networks, such as an Ethernet (e.g., IEEE 802.3) network. In the case of the broadcast server 130 (FIGS. 1 and 2) and the user reaction processing server 250 (FIG. 2), the network interface 516 interfaces between the hardware system 500 and the network connected to the user terminals 121, 122 and 123, allowing the hardware system 500 to communicate with those terminals. Similarly, in the case of the user terminals 121, 122 and 123 (FIGS. 1 and 2), the network interface 516 interfaces between the hardware system 500 and the network connected to the broadcast server 130 and/or the user reaction processing server 250, allowing the hardware system 500 to communicate with those servers. The mass storage 518 provides permanent storage for the data and programming instructions needed to perform the above-described functions implemented in the user terminals 121, 122 and 123, the broadcast server 130 or the user reaction processing server 250, whereas the system memory 514 (e.g., DRAM) provides temporary storage for the data and programming instructions when they are executed by the processor 502. The I/O ports 520 are one or more serial and/or parallel communication ports that provide communication with additional peripheral devices that may be coupled to the hardware system 500.
  • The hardware system 500 may include a variety of system architectures. Further, various components of the hardware system 500 may be rearranged. For example, the cache 504 may be on-chip with the processor 502. Alternatively, the cache 504 and the processor 502 may be packaged together as a “processor module,” with the processor 502 being referred to as the “processor core.” Furthermore, certain implementations of the present invention may not require or include all of the above components. For example, the peripheral devices shown coupled to the standard I/O bus 508 may be coupled to the high performance I/O bus 506. In addition, in some implementations, only a single bus may exist, with the components of the hardware system 500 being coupled to the single bus. Furthermore, the hardware system 500 may include additional components such as additional processors, storage devices or memories. As discussed below, in one embodiment, the operations of the online broadcasting systems described herein, including the online broadcasting systems 100 and 200, are implemented as a series of software routines run by the hardware system 500. These software routines comprise a plurality or series of instructions to be executed by a processor in the hardware system, such as the processor 502. Initially, the series of instructions is stored in a storage device such as the mass storage 518. However, the series of instructions can be stored in any suitable storage medium such as a diskette, CD-ROM, ROM, EEPROM, etc. Furthermore, the series of instructions need not be stored locally and could be received from a remote storage device, such as a server on a network, via the network/communication interface 516. The instructions are copied from the storage device, such as the mass storage 518, into the memory 514 and then accessed and executed by the processor 502.
  • An operating system manages and controls the operation of the hardware system 500, including the input and output of data to and from software applications (not shown). The operating system provides an interface between the software applications being executed on the system and the hardware components of the system. According to one embodiment of the present invention, the operating system is the Windows® 95/98/NT/XP/Vista operating system, which is available from Microsoft Corporation of Redmond, Wash. However, the present invention may be used with other suitable operating systems, such as the Apple Macintosh Operating System available from Apple Computer Inc. of Cupertino, Calif., UNIX operating systems, LINUX operating systems and the like.
  • Advantageously, the present invention provides a method and apparatus that offer “virtual co-presence” to viewers watching a broadcast program through a communication network by relaying their reactions to the broadcast program in real time. Further, while the present invention has been shown and described with respect to a preferred embodiment, those skilled in the art will recognize that various changes and modifications may be made without departing from the spirit and scope of the present invention as defined in the appended claims.

Claims (21)

1-25. (canceled)
26. A method comprising:
transmitting, by a server computer, a broadcast program to a plurality of user terminals;
receiving, by the server computer from the plurality of user terminals, individual user reaction data, the individual user reaction data generated at each of the plurality of user terminals in relation to the broadcast program, the individual user reaction data comprising user reaction to the broadcast program, the individual user reaction data comprising a plurality of individual user reaction fields, each individual user reaction field corresponding to a user reaction icon and comprising a flag indicating whether the corresponding user reaction icon has been selected by the user;
aggregating, by the server computer, the individual user reaction data to create total user reaction data based on the flags of the individual user reaction fields;
generating, by the server computer, interface control data based on the total user reaction data, wherein the interface control data, when executed, is operative to cause a client application to generate output indicative of the total user reaction data; and
transmitting, by the server computer, the generated interface control data to a user terminal in the plurality of user terminals.
27. The method of claim 26, wherein the user reaction icon comprises a predetermined reaction the users may express in response to the broadcast program.
28. The method of claim 26, wherein the user reaction icon comprises an icon selected from a group of icon types consisting of a hooting icon, a cheering icon, and an applauding icon.
29. The method of claim 26, wherein when a user of the user terminal selects the user reaction icon, the user terminal generates the individual user reaction data.
30. The method of claim 29, wherein the flag is set to ‘1’ when the user reaction icon has been selected by the user.
31. The method of claim 29, wherein the flag is set to ‘0’ when the user reaction icon has not been selected by the user.
32. The method of claim 26, further comprising periodically collecting the individual user reaction data for a time interval of a predetermined length.
33. The method of claim 26, wherein the total user reaction data reflects reactions from a plurality of users operating the plurality of user terminals in response to the broadcast program for a predetermined time interval.
34. The method of claim 26, wherein the aggregating of the individual user reaction data to create total user reaction data based on the flags of the individual user reaction fields further comprises generating the total user reaction data by calculating a total user reaction number for each of the user reaction fields.
35. The method of claim 34, wherein the calculating of the total user reaction number comprises calculating a number of the individual user reaction data with the flag of the corresponding user reaction field set to ‘1’.
36. A computing device comprising:
a processor;
a storage medium for tangibly storing thereon program logic for execution by the processor, the program logic comprising:
broadcast program transmitting logic executed by the processor for transmitting a broadcast program to a plurality of user terminals;
receiving logic executed by the processor for receiving, from the plurality of user terminals, individual user reaction data, the individual user reaction data generated at each of the plurality of user terminals in relation to the broadcast program, the individual user reaction data comprising user reaction to the broadcast program, the individual user reaction data comprising a plurality of individual user reaction fields, each individual user reaction field corresponding to a user reaction icon and comprising a flag indicating whether the corresponding user reaction icon has been selected by the user;
aggregating logic executed by the processor for aggregating the individual user reaction data to create total user reaction data based on the flags of the individual user reaction fields;
generating logic executed by the processor for generating interface control data based on the total user reaction data, wherein the interface control data, when executed, is operative to cause a client application to generate output indicative of the total user reaction data; and
control data transmitting logic executed by the processor for transmitting the generated interface control data to a user terminal in the plurality of user terminals.
37. The computing device of claim 36, wherein the user reaction icon comprises a predetermined reaction the users may express in response to the broadcast program.
38. The computing device of claim 36, wherein the user reaction icon comprises an icon selected from a group of icon types consisting of a hooting icon, a cheering icon, and an applauding icon.
39. The computing device of claim 36, wherein when a user of the user terminal selects the user reaction icon, the user terminal generates the individual user reaction data.
40. The computing device of claim 36, wherein the flag is set to ‘1’ when the user reaction icon has been selected by the user.
41. The computing device of claim 36, wherein the flag is set to ‘0’ when the user reaction icon has not been selected by the user.
42. The computing device of claim 36, wherein the total user reaction data reflects reactions from a plurality of users operating the plurality of user terminals in response to the broadcast program for a predetermined time interval.
43. The computing device of claim 36, wherein the aggregating logic further comprises user reaction data generating logic executed by the processor for generating the total user reaction data by calculating a total user reaction number for each of the user reaction fields.
44. The computing device of claim 43, wherein the calculating of the total user reaction number comprises calculating logic executed by the processor for calculating a number of the individual user reaction data with the flag of the corresponding user reaction field set to ‘1’.
45. A non-transitory computer readable storage medium having computer readable program code in the medium for causing a processor to execute computer instructions, the instructions comprising:
transmitting, by the processor, a broadcast program to a plurality of user terminals;
receiving, by the processor from the plurality of user terminals, individual user reaction data, the individual user reaction data generated at each of the plurality of user terminals in relation to the broadcast program, the individual user reaction data comprising user reaction to the broadcast program, the individual user reaction data comprising a plurality of individual user reaction fields, each individual user reaction field corresponding to a user reaction icon and comprising a flag indicating whether the corresponding user reaction icon has been selected by the user;
aggregating, by the processor, the individual user reaction data to create total user reaction data based on the flags of the individual user reaction fields;
generating, by the processor, interface control data based on the total user reaction data, wherein the interface control data, when executed, is operative to cause a client application to generate output indicative of the total user reaction data; and
transmitting, by the processor, the generated interface control data to a user terminal in the plurality of user terminals.
US14/531,053 2007-07-11 2014-11-03 Method and System for Providing Virtual Co-Presence to Broadcast Audiences in an Online Broadcasting System Abandoned US20150052540A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/531,053 US20150052540A1 (en) 2007-07-11 2014-11-03 Method and System for Providing Virtual Co-Presence to Broadcast Audiences in an Online Broadcasting System

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2007-69641 2007-07-11
KR1020070069641A KR20090006371A (en) 2007-07-11 2007-07-11 Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system
US11/873,150 US8887185B2 (en) 2007-07-11 2007-10-16 Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system
US14/531,053 US20150052540A1 (en) 2007-07-11 2014-11-03 Method and System for Providing Virtual Co-Presence to Broadcast Audiences in an Online Broadcasting System

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/873,150 Continuation US8887185B2 (en) 2007-07-11 2007-10-16 Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system

Publications (1)

Publication Number Publication Date
US20150052540A1 true US20150052540A1 (en) 2015-02-19

Family

ID=40254200

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/873,150 Expired - Fee Related US8887185B2 (en) 2007-07-11 2007-10-16 Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system
US14/531,053 Abandoned US20150052540A1 (en) 2007-07-11 2014-11-03 Method and System for Providing Virtual Co-Presence to Broadcast Audiences in an Online Broadcasting System

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/873,150 Expired - Fee Related US8887185B2 (en) 2007-07-11 2007-10-16 Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system

Country Status (2)

Country Link
US (2) US8887185B2 (en)
KR (1) KR20090006371A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106792167A (en) * 2016-12-08 2017-05-31 广州华多网络科技有限公司 The method of adjustment and system of the broadcast interface of online live application
WO2021120692A1 (en) * 2019-12-19 2021-06-24 广州华多网络科技有限公司 Voice gift giving method, apparatus and device, and storage medium
US11260294B2 (en) 2017-05-30 2022-03-01 Microsoft Technology Licensing, Llc Virtual controller for game injection

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8925001B2 (en) 2008-09-12 2014-12-30 At&T Intellectual Property I, L.P. Media stream generation based on a category of user expression
US20140176665A1 (en) * 2008-11-24 2014-06-26 Shindig, Inc. Systems and methods for facilitating multi-user events
FR2942928B1 (en) * 2009-03-03 2011-04-01 Alcatel Lucent METHOD AND SYSTEM FOR MULTICRITERALLY MANAGING PRESENCE NOTIFICATIONS
US11064257B2 (en) * 2011-11-07 2021-07-13 Monet Networks, Inc. System and method for segment relevance detection for digital content
US10638197B2 (en) * 2011-11-07 2020-04-28 Monet Networks, Inc. System and method for segment relevance detection for digital content using multimodal correlations
WO2013072879A2 (en) * 2011-11-16 2013-05-23 Chandrasagaran Murugan A remote engagement system
US11115720B2 (en) * 2016-12-06 2021-09-07 Facebook, Inc. Providing a live poll within a video presentation
CN110896661A (en) * 2017-07-14 2020-03-20 夏普株式会社 Information processing device, terminal device, information providing system, program for causing computer to function as information processing device, program for causing computer to function as terminal device, and method for controlling information processing device
CN109302618A (en) * 2018-11-27 2019-02-01 网易(杭州)网络有限公司 Live streaming picture rendering method, device and storage medium in mobile terminal
CN110475127A (en) * 2019-07-25 2019-11-19 天脉聚源(杭州)传媒科技有限公司 A kind of virtual auditorium generation method of 3D, system, device and storage medium
CN113923512A (en) * 2021-10-13 2022-01-11 咪咕文化科技有限公司 Method and device for processing event video of non-live audience and computing equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5726701A (en) * 1995-04-20 1998-03-10 Intel Corporation Method and apparatus for stimulating the responses of a physically-distributed audience
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US20020065826A1 (en) * 2000-07-19 2002-05-30 Bell Christopher Nathan Systems and processes for measuring, evaluating and reporting audience response to audio, video, and other content
US20070005795A1 (en) * 1999-10-22 2007-01-04 Activesky, Inc. Object oriented video system
US20070268312A1 (en) * 2006-05-07 2007-11-22 Sony Computer Entertainment Inc. Methods and systems for processing an interchange of real time effects during video communication
US20070288951A1 (en) * 2006-04-28 2007-12-13 First Data Corporation Incentives for viewing advertisements
US20080086742A1 (en) * 2006-10-09 2008-04-10 Verizon Services Corp. Systems And Methods For Real-Time Interactive Television Polling
US20080189733A1 (en) * 2006-10-31 2008-08-07 Apostolopoulos John G Content rating systems and methods
US20080320510A1 (en) * 2007-06-22 2008-12-25 Microsoft Corporation Sharing viewing statistics
US20090094286A1 (en) * 2007-10-02 2009-04-09 Lee Hans C System for Remote Access to Media, and Reaction and Survey Data From Viewers of the Media

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100366708B1 (en) * 2000-06-22 2003-01-09 주식회사 아이온커뮤니케이션즈 Internet broadcasting televiewer interest research system
JP2002140487A (en) * 2000-10-31 2002-05-17 Matsushita Electric Ind Co Ltd Device for summing up responses from data broadcasting viewer and data broadcasting system provided with device for summing up responses
KR20010067677A (en) * 2001-03-06 2001-07-13 송재영 Method of Interactive Interface with TV using Calling Identity Delivery Equipment
KR100546489B1 (en) 2004-01-17 2006-01-26 김권영 The system of stage setting for broadcasting of bidirectional tv and the method of progress on broadcasting program using that system

Also Published As

Publication number Publication date
US8887185B2 (en) 2014-11-11
US20090019467A1 (en) 2009-01-15
KR20090006371A (en) 2009-01-15

Similar Documents

Publication Publication Date Title
US8887185B2 (en) Method and system for providing virtual co-presence to broadcast audiences in an online broadcasting system
US7809773B2 (en) Comment filters for real-time multimedia broadcast sessions
EP2834988B1 (en) Pre-fetch ads while serving ads in live stream
US20090064017A1 (en) Tuning/customization
CN105872581A (en) System and method for providing video direct broadcasting room services
CN113613027B (en) Live broadcast room recommendation method and device and computer equipment
CN113949892A (en) Live broadcast interaction method and system based on virtual resource consumption and computer equipment
CN112243137A (en) Live broadcast interface updating method, device, server and system
CN113573083A (en) Live wheat-connecting interaction method and device and computer equipment
CN114268812B (en) Live broadcast room virtual resource giving method, device, computer equipment and storage medium
CN103442288A (en) Method, device and system for processing of trans-equipment data contents
CN113573105B (en) Live broadcast interaction method based on virtual gift of screen and computer equipment
CN114630155A (en) Live broadcast interaction method, system and device based on user identity and computer equipment
CN113840155A (en) Method and system for replacing virtual gift and computer equipment
JP2009182410A (en) Viewing and listening reaction sharing system, viewing and listening reaction management server and viewing and listening reaction sharing method
CN113727136B (en) Live broadcast pushing method, system, device, equipment and storage medium
WO2010119834A1 (en) Content url announcement system
CN113891162B (en) Live broadcast room loading method and device, computer equipment and storage medium
CN114222152B (en) Virtual gift interaction method and device for urban popularization and computer equipment
JP6604092B2 (en) Broadcast content providing system, broadcast content providing device, playback device, advertisement providing device, and control method and control program thereof
WO2009107653A1 (en) Information system, information terminal, and information communication method
CN116347176A (en) Live broadcasting room game interaction method, device, system and equipment with separated front and back ends
CN114302165B (en) Live broadcast room virtual gift giving method, device, computer equipment and storage medium
CN115883911A (en) Method and device for displaying gift information of live broadcast room, electronic equipment and storage medium
CN117376626A (en) Live broadcast picture display method, device and system, storage medium and electronic equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: EXCALIBUR IP, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO! INC.;REEL/FRAME:038383/0466

Effective date: 20160418

AS Assignment

Owner name: YAHOO! INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EXCALIBUR IP, LLC;REEL/FRAME:038951/0295

Effective date: 20160531

AS Assignment

Owner name: EXCALIBUR IP, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO! INC.;REEL/FRAME:038950/0592

Effective date: 20160531

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION