US9372531B2 - Detecting an event within interactive media including spatialized multi-channel audio content

Info

Publication number
US9372531B2
Authority
US
United States
Prior art keywords
event
media content
interactive media
fingerprint
media presentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/795,877
Other versions
US20140274353A1 (en)
Inventor
Jeff Benson
Michael Gubman
Craig Kawahara
Bob Coover
Markus K. Cremer
Andy Mai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Citibank NA
Roku Inc
Original Assignee
Gracenote Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Assigned to GRACENOTE, INC. reassignment GRACENOTE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CREMER, MARKUS K., GUBMAN, MICHAEL, BENSON, JEFF, COOVER, BOB, KAWAHARA, CRAIG, MAI, ANDY
Priority to US13/795,877
Application filed by Gracenote Inc filed Critical Gracenote Inc
Assigned to JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT reassignment JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GRACENOTE, INC.
Publication of US20140274353A1
Priority to US15/003,658 (US10055010B2)
Publication of US9372531B2
Application granted
Assigned to JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT reassignment JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT SECURITY AGREEMENT Assignors: CASTTV, INC., GRACENOTE, INC., TRIBUNE BROADCASTING COMPANY, LLC
Assigned to CastTV Inc., TRIBUNE MEDIA SERVICES, LLC, TRIBUNE DIGITAL VENTURES, LLC, GRACENOTE, INC. reassignment CastTV Inc. RELEASE OF SECURITY INTEREST IN PATENT RIGHTS Assignors: JPMORGAN CHASE BANK, N.A.
Assigned to CITIBANK, N.A., AS COLLATERAL AGENT reassignment CITIBANK, N.A., AS COLLATERAL AGENT SUPPLEMENTAL SECURITY AGREEMENT Assignors: GRACENOTE DIGITAL VENTURES, LLC, GRACENOTE MEDIA SERVICES, LLC, GRACENOTE, INC.
Priority to US16/017,170 (US10156894B2)
Priority to US16/168,412 (US10345892B2)
Priority to US16/425,490 (US10824222B2)
Assigned to CITIBANK, N.A. reassignment CITIBANK, N.A. SUPPLEMENTAL SECURITY AGREEMENT Assignors: A. C. NIELSEN COMPANY, LLC, ACN HOLDINGS INC., ACNIELSEN CORPORATION, ACNIELSEN ERATINGS.COM, AFFINNOVA, INC., ART HOLDING, L.L.C., ATHENIAN LEASING CORPORATION, CZT/ACN TRADEMARKS, L.L.C., Exelate, Inc., GRACENOTE DIGITAL VENTURES, LLC, GRACENOTE MEDIA SERVICES, LLC, GRACENOTE, INC., NETRATINGS, LLC, NIELSEN AUDIO, INC., NIELSEN CONSUMER INSIGHTS, INC., NIELSEN CONSUMER NEUROSCIENCE, INC., NIELSEN FINANCE CO., NIELSEN FINANCE LLC, NIELSEN HOLDING AND FINANCE B.V., NIELSEN INTERNATIONAL HOLDINGS, INC., NIELSEN MOBILE, LLC, NIELSEN UK FINANCE I, LLC, NMR INVESTING I, INC., NMR LICENSING ASSOCIATES, L.P., TCG DIVESTITURE INC., THE NIELSEN COMPANY (US), LLC, THE NIELSEN COMPANY B.V., TNC (US) HOLDINGS, INC., VIZU CORPORATION, VNU INTERNATIONAL B.V., VNU MARKETING INFORMATION, INC.
Priority to US16/947,969 (US11068042B2)
Assigned to CITIBANK, N.A reassignment CITIBANK, N.A CORRECTIVE ASSIGNMENT TO CORRECT THE PATENTS LISTED ON SCHEDULE 1 RECORDED ON 6-9-2020 PREVIOUSLY RECORDED ON REEL 053473 FRAME 0001. ASSIGNOR(S) HEREBY CONFIRMS THE SUPPLEMENTAL IP SECURITY AGREEMENT. Assignors: A.C. NIELSEN (ARGENTINA) S.A., A.C. NIELSEN COMPANY, LLC, ACN HOLDINGS INC., ACNIELSEN CORPORATION, ACNIELSEN ERATINGS.COM, AFFINNOVA, INC., ART HOLDING, L.L.C., ATHENIAN LEASING CORPORATION, CZT/ACN TRADEMARKS, L.L.C., Exelate, Inc., GRACENOTE DIGITAL VENTURES, LLC, GRACENOTE MEDIA SERVICES, LLC, GRACENOTE, INC., NETRATINGS, LLC, NIELSEN AUDIO, INC., NIELSEN CONSUMER INSIGHTS, INC., NIELSEN CONSUMER NEUROSCIENCE, INC., NIELSEN FINANCE CO., NIELSEN FINANCE LLC, NIELSEN HOLDING AND FINANCE B.V., NIELSEN INTERNATIONAL HOLDINGS, INC., NIELSEN MOBILE, LLC, NMR INVESTING I, INC., NMR LICENSING ASSOCIATES, L.P., TCG DIVESTITURE INC., THE NIELSEN COMPANY (US), LLC, THE NIELSEN COMPANY B.V., TNC (US) HOLDINGS, INC., VIZU CORPORATION, VNU INTERNATIONAL B.V., VNU MARKETING INFORMATION, INC.
Assigned to GRACENOTE, INC., THE NIELSEN COMPANY (US), LLC reassignment GRACENOTE, INC. PARTIAL RELEASE OF SECURITY INTEREST Assignors: CITIBANK, N.A.
Assigned to ROKU, INC. reassignment ROKU, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GRACENOTE, INC.
Assigned to MORGAN STANLEY SENIOR FUNDING, INC. reassignment MORGAN STANLEY SENIOR FUNDING, INC. PATENT SECURITY AGREEMENT SUPPLEMENT Assignors: ROKU, INC.
Assigned to GRACENOTE, INC., GRACENOTE DIGITAL VENTURES, LLC reassignment GRACENOTE, INC. RELEASE (REEL 042262 / FRAME 0601) Assignors: CITIBANK, N.A.
Assigned to ROKU, INC., ROKU DX HOLDINGS, INC. reassignment ROKU, INC. TERMINATION AND RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT (REEL/FRAME 056982/0194) Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Assigned to GRACENOTE, INC., A. C. NIELSEN COMPANY, LLC, NETRATINGS, LLC, Exelate, Inc., THE NIELSEN COMPANY (US), LLC, GRACENOTE MEDIA SERVICES, LLC reassignment GRACENOTE, INC. RELEASE (REEL 053473 / FRAME 0001) Assignors: CITIBANK, N.A.
Assigned to Exelate, Inc., A. C. NIELSEN COMPANY, LLC, GRACENOTE MEDIA SERVICES, LLC, NETRATINGS, LLC, GRACENOTE, INC., THE NIELSEN COMPANY (US), LLC reassignment Exelate, Inc. RELEASE (REEL 054066 / FRAME 0064) Assignors: CITIBANK, N.A.
Legal status: Active
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/73 Authorising game programs or game devices, e.g. checking authenticity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/75 Enforcing rules, e.g. detecting foul play or generating lists of cheating players
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements

Definitions

  • the subject matter disclosed herein generally relates to the processing of data. Specifically, the present disclosure addresses systems and methods to facilitate detecting an event within interactive media.
  • Media may be presented to one or more users, for example, for purposes of entertainment, merchandising (e.g., advertising), education, public service, or other communication of information.
  • Some media may be characterized as being “interactive” where the user to whom the media is presented can interact with the media itself (e.g., its content). That is, the content of the media depends, at least partly, on input from the user.
  • a computer game is an example of interactive media in which a user (e.g., game player) has at least some control over the content of the computer game (e.g., video or audio associated with achieving or failing a goal, gaining or losing an item in the game, or the appearance of the user's character or avatar within the game).
  • Software that emulates or simulates a musical instrument (e.g., a drum machine or a piano tutorial) is another example of interactive media, in which the user at least partly controls the audio content (e.g., sounds or music) that is played.
  • a movie may be presented to a user (e.g., movie viewer) via a player device (e.g., a portable media player) or player software (e.g., executing on a computer).
  • Although the player device or software may allow the user to play the movie, pause the movie, skip forward within the movie, and skip backward within the movie, the movie itself (e.g., the content of the movie) is unchangeable by the viewer. Hence, playing the movie itself is not an interactive experience for the viewer.
  • an interactive experience includes the playing or triggering of non-interactive media.
  • a video “cut scene” may be automatically played between levels of a computer game.
  • the computer game is an interactive experience, for at least the reason that the player controls when one level is completed, thereby controlling when the video “cut scene” is played, even though the video “cut scene” itself is non-interactive.
  • FIG. 1 is a network diagram illustrating a network environment suitable for detecting an event within interactive media, according to some example embodiments.
  • FIG. 2 is a block diagram illustrating components of a reference server configured to facilitate detection of an event within interactive media, according to some example embodiments.
  • FIG. 3 is a block diagram illustrating components of a device configured to detect an event within interactive media, according to some example embodiments.
  • FIG. 4 is a conceptual diagram illustrating detection of an event within an interactive media presentation, according to some example embodiments.
  • FIG. 5 is a layout diagram illustrating a notification that may be displayed with an interactive media presentation, according to some example embodiments.
  • FIG. 6 is a layout diagram illustrating the notification, according to some example embodiments.
  • FIGS. 7-9 are flowcharts illustrating operations of a device in performing a method of detecting an event within interactive media, according to some example embodiments.
  • FIG. 10 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
  • Example methods and systems are directed to detection of one or more events within interactive media (e.g., within a presentation of interactive media). Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
  • a device may be used to present interactive media (e.g., a game, such as a videogame) to a user of the device (e.g., a player of the game).
  • the interactive media may be stored on the device (e.g., in local memory or other storage) and presented by the device (e.g., by executing a software application, applet, or app).
  • the interactive media may be stored by a server (e.g., a game server) and provided to the device by the server (e.g., streamed live, downloaded portion by portion, or downloaded in full) for presentation by the device.
  • the interactive media may include media files that each store media content (e.g., video content, image content, or audio content), and a presentation of the interactive media may be generated, presented, or both, by the device based on user input that influences or controls which media files are included in the presentation. That is, the user input may fully or partially determine whether and when a particular media file is included in the presentation.
  • A monitoring device (e.g., a second device) may be used to monitor the presentation of the interactive media and detect an event that occurs therein.
  • a monitoring device may be configured and positioned to access media content from the presentation of the interactive media.
  • the monitoring device may be configured and positioned to record video content (e.g., one or more video frames, which may be still images) with a camera and record audio content with a microphone.
  • the monitoring device may generate an identifier, such as a fingerprint or watermark, of the media content and compare the generated identifier with a reference identifier that is generated from the source of the media content. Based on the generated identifier matching the reference identifier, the monitoring device may detect that an event has occurred within the interactive media presentation and present a corresponding notification.
  • the monitoring device may present a notification that references the occurrence of the detected event.
  • a notification may be presented to the user (e.g., via the monitoring device, the presenting device, or both).
  • the notification may be presented to another user (e.g., a socially connected friend, follower, or connection of the user, as identified by or according to a social networking system).
  • the notification is presented by the monitoring device.
  • the monitoring device may cause the presenting device (e.g., the device that presents the interactive media) to present the notification.
  • the presenting device and the monitoring device are combined into a single device.
  • the monitoring device may function entirely independent of any server or other source that may be providing the interactive media presentation to the presentation device. That is, the monitoring device may detect an event within the interactive media presentation and present a corresponding notification without communication from such a server or other source of the interactive media presentation. Further details are described below.
  • FIG. 1 is a network diagram illustrating a network environment 100 suitable for detecting an event within interactive media, according to some example embodiments.
  • the network environment 100 includes a reference server 110 , a database 115 , a social network server 118 , an interactive media presentation server 120 , and devices 130 , 131 , and 150 , all of which may be communicatively coupled to each other via a network 190 .
  • the interactive media presentation server 120 is communicatively coupled to the device 130 by a separate network or other communication path.
  • the reference server 110 , the database 115 , the social network server 118 , the interactive media presentation server 120 , and the devices 130 , 131 , and 150 may each be implemented in a computer system, in whole or in part, as described below with respect to FIG. 10 .
  • the reference server 110 , with or without the database 115 , may form all or part of a network-based system 105 .
  • the network-based system 105 may be or include a cloud-based system that provides one or more network-based services (e.g., provision of reference identifiers for media content included as part of various interactive media).
  • users 132 and 152 are also shown in FIG. 1 .
  • One or both of the users 132 and 152 may be a human user (e.g., a human being), a machine user (e.g., a computer configured by a software program to interact with the device 130 ), or any suitable combination thereof (e.g., a human assisted by a machine or a machine supervised by a human).
  • the user 132 is not part of the network environment 100 , but is associated with the device 130 and may be a user of the device 130 .
  • the device 130 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132 .
  • the user 132 may also be associated with the device 131 and may be a user of the device 131 .
  • Likewise, the device 131 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132 .
  • the device 131 is able to monitor (e.g., by accessing or receiving) media content presented as part of an interactive media presentation by the device 130 .
  • the device 131 and the device 130 are combined into a single device. In such example embodiments, the monitoring of the media content may be performed internally by the single device (e.g., within memory).
  • the user 152 is not part of the network environment 100 , but is associated with the device 150 .
  • the user 152 is a socially connected friend, follower, or connection of the user 132 (e.g., as identified or indicated by a social networking service, such as Facebook® or Twitter®).
  • the device 150 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 152 .
  • any of the machines, databases, or devices shown in FIG. 1 may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine, database, or device.
  • a computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 10 .
  • a “database” is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database (e.g., an object-relational database), a triple store, a hierarchical data store, or any suitable combination thereof.
  • any two or more of the machines, databases, or devices illustrated in FIG. 1 may be combined into a single machine, and the functions described herein for any single machine, database, or device may be subdivided among multiple machines, databases, or devices.
  • the network 190 may be any network that enables communication between or among machines, databases, and devices (e.g., the reference server 110 and the device 131 ). Accordingly, the network 190 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 190 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
  • FIG. 2 is a block diagram illustrating components of the reference server 110 , which may be configured to facilitate detection of an event within interactive media, according to some example embodiments.
  • the reference server 110 may be a machine that, as shown, includes a media access module 210 , a fingerprint generation module 220 , a watermark extraction module 230 , and a notification correlation module 240 , all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
  • any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software.
  • any module described herein may configure a processor to perform the operations described herein for that module.
  • any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules.
  • modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
  • the media access module 210 of the reference server 110 is configured to access media (e.g., from the database 115 , from the interactive media presentation server 120 , or from both).
  • the accessed media may include one or more media files containing media content that may be presentable as part of the interactive media presentation.
  • the interactive media presentation server 120 may store such media files, and the media access module 210 of the reference server 110 may access or retrieve those media files.
  • the database 115 may be used by the media access module 210 to temporarily or permanently store the media files (e.g., for fingerprint generation, watermark extraction, or both).
  • the fingerprint generation module 220 of the reference server 110 is configured to generate reference fingerprints from the media files that are accessed by the media access module 210 .
  • the fingerprint generation module 220 may apply one or more algorithms to a video file and generate a reference fingerprint that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation.
  • the fingerprint generation module 220 may apply one or more algorithms to an audio file and thus generate a reference fingerprint usable to identify a playing of that audio file within the interactive media presentation.
  • the fingerprint generation module 220 may apply one or more algorithms to an image file and accordingly generate a reference fingerprint usable to identify a displaying of that image file within the interactive media presentation.
  • the fingerprint generation module 220 may apply one or more algorithms to a text file and thereby generate a reference fingerprint that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
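  • The disclosure does not commit to any particular fingerprinting algorithm. As a hedged illustration only, a reference fingerprint for audio content might be generated by hashing coarse spectral-energy changes across short frames, as in the sketch below; the function name, band count, and frame sizes are assumptions, not details from the patent.

```python
# Minimal sketch of a reference-fingerprint generator for audio content,
# assuming a common spectral-energy approach; the patent does not mandate
# any specific algorithm, and all names here are illustrative.
import numpy as np

NUM_BANDS = 16     # coarse frequency bands per frame
FRAME_SIZE = 2048  # samples per analysis frame
HOP_SIZE = 1024    # 50% overlap between frames

def generate_audio_fingerprint(samples: np.ndarray) -> list[int]:
    """Hash each frame to a NUM_BANDS-bit integer: one bit per band,
    set when that band's energy rises relative to the previous frame."""
    fingerprint = []
    prev_energies = np.zeros(NUM_BANDS)
    for start in range(0, len(samples) - FRAME_SIZE + 1, HOP_SIZE):
        spectrum = np.abs(np.fft.rfft(samples[start:start + FRAME_SIZE]))
        bands = np.array_split(spectrum, NUM_BANDS)
        energies = np.array([band.sum() for band in bands])
        bits = (energies > prev_energies).astype(int)
        fingerprint.append(int("".join(map(str, bits)), 2))
        prev_energies = energies
    return fingerprint
```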
  • the watermark extraction module 230 of the reference server 110 is configured to extract reference watermarks from the media files that are accessed by the media access module 210 .
  • the watermark extraction module 230 may apply one or more algorithms to a video file and extract a reference watermark that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation.
  • the watermark extraction module 230 may apply one or more algorithms to an audio file and thus extract a reference watermark usable to identify a playing of that audio file within the interactive media presentation.
  • the watermark extraction module 230 may apply one or more algorithms to an image file and accordingly extract a reference watermark usable to identify a displaying of that image file within the interactive media presentation.
  • the watermark extraction module 230 may apply one or more algorithms to a text file and thereby extract a reference watermark that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
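  • Unlike a fingerprint, which is computed from the content itself, a watermark is a payload embedded in the content and recovered by the extractor. As a hedged sketch only (the disclosure does not prescribe a scheme), a textbook least-significant-bit image watermark could be read back as follows:

```python
# Minimal sketch of watermark extraction, assuming a textbook
# least-significant-bit (LSB) image watermark; the patent does not
# prescribe any particular watermarking scheme.
import numpy as np

def extract_lsb_watermark(pixels: np.ndarray, payload_bits: int) -> bytes:
    """Read the low bit of each pixel value in row-major order and pack
    the first `payload_bits` of them into bytes.
    `pixels` is an integer-valued image array (e.g., dtype uint8)."""
    bits = (pixels.flatten() & 1)[:payload_bits]
    return np.packbits(bits).tobytes()
```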
  • the reference server 110 may form all or part of a cloud-based server system (e.g., of one or more machines) that is configured to generate fingerprints from various media content, store watermarks for various media content, or any suitable combination thereof.
  • the notification correlation module 240 of the reference server 110 is configured to correlate a media file (e.g., accessed by the media access module 210 and processed by the fingerprint generation module 220 , the watermark extraction module 230 , or both) with a notification that references an event which may occur within the interactive media presentation.
  • the media file may contain video content that shows an in-game character congratulating the user (e.g., game player) on completing a difficult level of the game.
  • the completion of the difficult level of the game is the event that may occur within the interactive presentation, and this event, this media file, or both may be correlated with a notification that references the completion of this level of the game.
  • the notification correlation module 240 may access event data that correlates the event with the media file (e.g., from the interactive media presentation server, from the database 115 , or from both). Based on such event data, the notification correlation module 240 may map the event, the media file, or both, to the corresponding notification, which may be stored in the database 115 (e.g., after being automatically or manually generated based on the media file). This correspondence relationship (e.g., map) may be stored in the database 115 .
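  • For illustration, this correspondence relationship might be represented as a lookup table keyed by a hashable form of the reference identifier; the record fields in the sketch below are assumptions, not terminology from the disclosure.

```python
# Hypothetical shape of the event-to-notification map described above,
# keyed by a hashable form of the reference identifier. Field names are
# illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass(frozen=True)
class EventRecord:
    event_name: str     # e.g., "completed_difficult_level"
    media_file: str     # media file whose playback signifies the event
    notification: str   # message to present when the event is detected

reference_map: dict[tuple[int, ...], EventRecord] = {
    (0x3A21, 0x9F04, 0x1C88): EventRecord(
        event_name="completed_difficult_level",
        media_file="congrats_cutscene.mp4",
        notification="Congratulations on completing this level!",
    ),
}
```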
  • the reference server 110 may be configured to provide (e.g., to any one or more of devices 130 , 131 , and 150 ) a reference identifier (e.g., fingerprint or watermark) of the media file and a notification that corresponds to an event signified by the media file being presented within the interactive media presentation.
  • the reference identifier and the notification may be provided as part of a network-based service that supplements the interactive media presentation with additional information (e.g., the notification) upon detection of the event occurring.
  • the network-based system 105 provides such a service.
  • such a service may be provided without any cooperation, assistance, or other communication from the interactive media presentation server 120 or other source of the interactive media presentation.
  • some or all of the network-based system 105 may obtain and provide reference identifiers of various media files and the media content thereof, by accessing such media content (e.g., as a user) during a presentation of the interactive media presentation.
  • reference identifiers may be obtained from playing a computer game (e.g., to completion, automatically, by executing software scripts to simulate user input), and corresponding notifications may be generated (e.g., automatically or manually) and mapped to the obtained reference identifiers.
  • the network-based system 105 may provide a supplemental information service that complements the interactive media presentation, but is separate from the interactive media presentation and produced independently (e.g., without collaboration with the author or source of the interactive media presentation).
  • FIG. 3 is a block diagram illustrating components of the device 131 , which may be configured to detect an event within interactive media, according to some example embodiments.
  • the device 131 may be a machine that, as shown, includes a reference module 310 , a fingerprint module 320 , a watermark module 330 , a detection module 340 , and a presentation module 350 , all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
  • any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software.
  • any module described herein may configure a processor to perform the operations described herein for that module.
  • any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules.
  • modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
  • the device 131 may monitor an interactive media presentation being presented by the device 130 (e.g., by recording or otherwise accessing presented media content included in the interactive media presentation). Media content being presented as part of the interactive media presentation may thus be accessed by the device 131 (e.g., for detection of the event that corresponds to the media content).
  • the device 131 may include a camera that is configured to capture video content, image content, text content, or any suitable combination thereof, that appears in the interactive media presentation.
  • the device 131 may include a microphone that is configured to capture audio content that is played within the interactive media presentation.
  • the reference module 310 of the device 131 is configured to access a reference identifier (e.g., reference fingerprint or reference watermark) that is generated or extracted (e.g., by the reference server 110 ) from media content which is presentable as part of the interactive media presentation.
  • the reference identifier may be accessed from the network-based system 105 (e.g., from the reference server 110 or from the database 115 ).
  • the reference identifier may be denoted herein as a “first identifier.”
  • an event that corresponds to the media content may be configured to occur in response to a user input (e.g., generated by the user 132 and submitted via the device 130 , as the user 132 is interacting with the interactive media presentation).
  • the fingerprint module 320 of the device 131 is configured to generate a fingerprint from a playback of the media content as part of the interactive media presentation.
  • the fingerprint module 320 may apply one or more algorithms to a video file and generate a fingerprint that may be compared to a reference fingerprint for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation.
  • the fingerprint module 320 may apply one or more algorithms to an audio file and thus generate a fingerprint that may be compared to a reference fingerprint for that audio file and thereby identify a playing of that audio file within the interactive media presentation.
  • the fingerprint module 320 may apply one or more algorithms to an image file and accordingly generate a fingerprint that may be compared to a reference fingerprint for that image file and thereby identify a displaying of that image file within the interactive media presentation.
  • the fingerprint module 320 may apply one or more algorithms to a text file and thereby generate a fingerprint that may be compared to a reference fingerprint for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
  • the watermark module 330 of the device 131 is configured to extract a watermark from the playback of the media content as part of the interactive media presentation.
  • the watermark module 330 may apply one or more algorithms to a video file and extract a watermark that may be compared to a reference watermark for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation.
  • the watermark module 330 may apply one or more algorithms to an audio file and thus extract a watermark that may be compared to a reference watermark for that audio file and thereby identify a playing of that audio file within the interactive media presentation.
  • the watermark module 330 may apply one or more algorithms to an image file and accordingly extract a watermark that may be compared to a reference watermark for that image file and thereby identify a displaying of that image file within the interactive media presentation.
  • the watermark module 330 may apply one or more algorithms to a text file and thereby extract a watermark that may be compared to a reference watermark for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
  • the fingerprint module 320 and the watermark module 330 may be included in the device 131 .
  • the device 131 may be configured to generate fingerprints from various media content monitored by the device 131 , extract watermarks for such media content, or any suitable combination thereof.
  • the detection module 340 of the device 131 is configured to detect an occurrence of the event that corresponds to the media content.
  • the event may be configured to occur in response to a user input (e.g., from the user 132 ).
  • the detection of this occurrence of the event may be based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131 .
  • the occurrence of the event may be detected based on a match between a fingerprint generated by the fingerprint module 320 and a reference fingerprint accessed by the reference module 310 .
  • the occurrence of the event may be detected based on a match between a watermark extracted by the watermark module 330 and a reference watermark accessed by the reference module 310 .
  • the presentation module 350 of the device 131 is configured to present a notification (e.g., accessed from the network-based system 105 ) that references the occurrence of the event.
  • the notification may be presented based on the detecting of the event's occurrence, based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131 , or based on both.
  • FIG. 4 is a conceptual diagram illustrating detection of an event 432 within an interactive media presentation 430 , according to some example embodiments.
  • the interactive media presentation 430 is generated, presented, or both (e.g., by the device 130 ) based on user input 420 and based on media 405 .
  • the user input 420 may be received from the device 130 that is presenting the interactive media presentation 430 .
  • the interactive media presentation 430 may be a game (e.g., multimedia game) presented by game software that is executing on the device 130 , and the user input 420 may be or include control signals, commands, or choices generated by the user 132 as part of playing the game.
  • the media 405 may take the form of media files that contain media content 410 , 412 , 414 , and 416 .
  • Examples of media content (e.g., the media content 410 ) include video, an image, audio, text, or any suitable combination thereof.
  • the arrows pointing to the interactive media presentation 430 from the user input 420 and from the media 405 indicate that the interactive media presentation 430 is influenced, at least in part, by the user input 420 and the media 405 .
  • the event 432 occurs within the interactive media presentation 430 (e.g., as a result of the user input 420 ). Because the event 432 is occurring, the interactive media presentation 430 includes (e.g., incorporates) the media content 410 , which signifies the occurrence of the event 432 . As a result, a playback of the media content 410 is initiated by the interactive media presentation 430 (e.g., via the device 130 ).
  • a detection 440 of the event 432 that corresponds to the media content 410 may be performed (e.g., by the device 131 ).
  • the device 131 may be configured by event detection software that executes on the device 131 and performs the detection 440 .
  • a method discussed below with respect to FIGS. 7-9 may be implemented by such event detection software.
  • FIG. 5 is a layout diagram illustrating a notification 520 that may be displayed with (e.g., within) an interactive media presentation, according to some example embodiments.
  • a user interface 500 is depicted in the example form of a graphical window. Such a graphical window may be displayed on a display screen of the device 130 , while the device 130 is presenting the interactive media presentation 430 (e.g., within the user interface 500 ).
  • the interactive media presentation 430 is a game (e.g., titled “Majestic Fantasy 4: The Unkempt Realms”), and the user interface 500 is used to present various media content of the game (e.g., media content 410 ).
  • the notification 520 may be presented.
  • FIG. 5 depicts the notification 520 being presented within the user interface 500 .
  • the notification 520 may be presented outside the user interface 500 (e.g., elsewhere on a display screen of the device 130 , on a display screen of the device 131 , or on a display screen of the device 150 ).
  • the notification 520 references the event 432 (e.g., the occurrence of the event 432 within the interactive media presentation 430 ).
  • some example embodiments of the user interface 500 include a help button 510 (e.g., labeled “Get Help for This Level”).
  • the help button 510 appears in the user interface 500 in response to the detection 440 of the event 432 .
  • the event 432 may be an in-game defeat of the user 132 (e.g., player), and the detection 440 of such a defeat may cause the help button 510 to appear within the user interface 500 . If the user 132 clicks on the help button 510 , the notification 520 appears within the user interface 500 (e.g., to provide information that may be helpful to avoid another such defeat).
  • the event 432 may be an in-game promotion of the user 132 to a more difficult level of the game, and the detection 440 of such a promotion may trigger the appearance of the help button 510 .
  • the notification 520 may be shown (e.g., to provide strategy for playing the more difficult level of the game).
  • FIG. 6 is a layout diagram illustrating the notification 520 , according to some example embodiments.
  • the notification references the event 432 within the interactive media presentation 430 (e.g., references the occurrence of the event 432 ), and according to various example embodiments, the notification 520 may be presented to one or more of the users 132 and 152 (e.g., via one or more of the devices 130 , 131 , and 150 ).
  • the notification 520 may include any information that is pertinent to the occurrence of the event 432 within the interactive media presentation 430 .
  • the notification 520 may include a map 610 (e.g., an in-game map of a player's current level within a game).
  • the notification 520 may include one or more pieces of information 620 , 630 , 640 , and 650 , which each may reference the occurrence of the event 432 .
  • Examples of such information include a help document, a guide, an encouragement (e.g., to a player of a game, that the player persevere in attempting to win a difficult section or level of the game), a suggestion (e.g., that the player of the game attempt a different strategy), or an advertisement (e.g., that the player purchase a virtual in-game item that may enhance the player's enjoyment of the current section or level of the game).
  • Additional examples of such information include an offer for the purchase of one or more virtual goods (e.g., as an in-game purchase, an in-app purchase, downloadable level content (DLC), or any suitable combination thereof), as well as some or all of a user interface (e.g., an electronic storefront) operable to initiate such a purchase.
  • Further examples of such information include an offer for the purchase of one or more physical goods (e.g., related or recommended merchandise, games, memorabilia, soundtracks, or other physical items), as well as some or all of a user interface operable to initiate such a purchase.
  • the notification 520 includes a hyperlink to such information.
  • some or all of the information 620 , 630 , 640 , or 650 may refer to an achievement that is signified by the event 432 occurring within the interactive media presentation 430 (e.g., completion of a level in a game, as signified by a video “cut scene” that appears at the end of the level).
  • Any of the information 620 , 630 , 640 , or 650 may describe a virtual item in a virtual world (e.g., an upgraded sword within a fantasy adventure game, as signified by special music that plays upon acquisition of the upgraded sword).
  • the information 620 , 630 , 640 , or 650 may be included in the notification 520 based on a level of progress within a storyline of the interactive media presentation (e.g., a plot of a game). Similarly, any of the information 620 , 630 , 640 , or 650 may be included based on a level of advancement attained by the user 132 (e.g., within a character arc of an in-game character or avatar of the user 132 ). Accordingly, any of the information 620 , 630 , 640 , or 650 may refer to such a level of progress or level of advancement.
  • some or all of the information 620 , 630 , 640 , or 650 corresponds to a virtual location, virtual orientation, or both, within a virtual world.
  • the event 432 may be an arrival of the player at a particular location within the virtual world (e.g., a waterfall that hides a treasure chest), and the event 432 may be detected by the corresponding playback of the media content 410 (e.g., a particular audio pattern of water splashing sounds, a particular visual pattern of rock formations, or both).
  • any of the information 620 , 630 , 640 , or 650 may reference that particular location (e.g., the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Look behind the waterfall to find treasure!” or “If you want to search behind the waterfall, would you like to buy some goggles or an umbrella?”).
  • the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world. Accordingly, the event 432 may be detected by a corresponding playback of the media content 410 with a particular virtual orientation (e.g., a particular audio pattern of water splashing sounds whose frequency distribution indicates that the player is facing towards the waterfall that hides the treasure chest, a particular visual pattern of rock formations indicating that the player is facing the waterfall, or both). Based on this, any of the information 620 , 630 , 640 , or 650 may reference that particular orientation (e.g., facing towards the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Don't get distracted by the waterfall! Enemies may be lurking behind you!” or “If you want to search behind the waterfall, would you like to hire a helper to watch your back?”).
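  • As a hedged sketch of how such an orientation cue might be computed (the disclosure describes the idea, not an implementation), a monitoring device could compare the energy of a recognized sound across stereo channels:

```python
# Sketch of inferring virtual orientation from spatialized multi-channel
# audio via a simple stereo level-difference heuristic; this is an
# illustrative assumption, not the patent's method.
import numpy as np

def estimate_facing(left: np.ndarray, right: np.ndarray) -> str:
    """Compare RMS energy of a recognized sound (e.g., water splashing)
    in the left and right channels to guess the player's heading."""
    rms_l = np.sqrt(np.mean(left.astype(float) ** 2))
    rms_r = np.sqrt(np.mean(right.astype(float) ** 2))
    ratio_db = 20 * np.log10((rms_l + 1e-12) / (rms_r + 1e-12))
    if abs(ratio_db) < 1.0:   # roughly balanced: source ahead (or behind)
        return "facing the source"
    return "source to the left" if ratio_db > 0 else "source to the right"
```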
  • the notification 520 may include a reference 660 to the user 132 .
  • the users 132 and 152 may be socially connected to each other (e.g., as friends, followers, or connections, such as may be indicated by a social networking service). Accordingly, the notification 520 with the reference 660 may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430 .
  • the notification 520 may thus tell the user 152 that the user 132 has been defeated in playing a game, has been promoted to a more difficult level of the game, has acquired a virtual object within the game, has advanced to a particular point in the game's storyline, or has arrived at a particular virtual location within a virtual world in which the game is played.
  • FIGS. 7-9 are flowcharts illustrating operations of the device 131 in performing a method 700 of detecting the event 432 within the interactive media presentation 430 , according to some example embodiments.
  • Operations in the method 700 may be performed by the device 131 (e.g., separately from the device 130 , or sharing one or more operations with the device 130 ), using modules described above with respect to FIG. 3 .
  • the method 700 includes operations 710 , 720 , 730 , and 740 .
  • the reference module 310 of the device 131 accesses a first identifier (e.g., a first fingerprint or a first watermark).
  • the first identifier may be a reference identifier and may be accessed from the database 115 , and the first identifier may be obtained from the media content 410 , which is presentable as part of the interactive media presentation 430 .
  • the reference module 310 accesses a first fingerprint generated from the media content 410 (e.g., by the fingerprint generation module 220 of the reference server 110 ).
  • the reference module 310 may access a first watermark extracted from the media content 410 (e.g., by the watermark extraction module 230 of the reference server 110 ).
  • the fingerprint module 320 of the device 131 generates a second identifier (e.g., a second fingerprint) from a playback of the media content 410 as part of the interactive media presentation 430 .
  • the second identifier may be called a generated identifier.
  • the watermark module 330 of the device 131 extracts the second identifier (e.g., a second watermark) from the playback of the media content 410 as part of the interactive media presentation 430 .
  • the detection module 340 of the device 131 detects an occurrence of the event 432 within the interactive media presentation 430 . This detection may be based on the second identifier (e.g., generated identifier) matching the first identifier (e.g., reference identifier). For example, the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first fingerprint (e.g., reference fingerprint) of the media content 410 to a generated second fingerprint (e.g., generated fingerprint) of the media content 410 . Based on the accessed first fingerprint matching the generated second fingerprint, the detection module 340 may detect the occurrence of the event 432 .
  • the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first watermark (e.g., reference watermark) of the media content 410 to an extracted second watermark (e.g., extracted watermark) of the media content 410 . Based on the accessed first watermark matching the extracted second watermark, the detection module 340 may detect the occurrence of the event 432 .
  • the presentation module 350 of the device 131 presents the notification 520 , which may reference the occurrence of the event 432 within the interactive media presentation 430 .
  • the event 432 may be detected based on the second identifier being determined to match the first identifier (e.g., in operation 730 ).
  • the presentation module 350 may present the notification 520 on a display screen of the device 131 (e.g., within an alert or other message).
  • the notification 520 may be presented to the user 132 on the device 131 , which may be functioning as a supplementary device that monitors the interactive media presentation 430 and provides notifications (e.g., notification 520 ) in response to detected events (e.g., event 432 ) therein.
  • the presentation module 350 of the device 131 causes the device 130 to present the notification 520 (e.g., within the user interface 500 , in which the interactive media presentation 430 may be presented). Accordingly, the notification 520 may be presented to the user 132 on the device 130 , which may be both presenting the interactive media presentation 430 and providing notifications (e.g., notification 520 ) in response to detected events (e.g., event 432 ) therein.
  • the presentation module 350 causes the device 150 to present the notification 520 (e.g., within an alert or other message).
  • the notification 520 may be presented with the reference 660 (e.g., to the user 132 ).
  • the notification 520 may be presented to the user 152 on the device 150 , which may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430 , as indicated by detection of the playback of the media content 410 as part of the interactive media presentation 430 .
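  • Putting operations 710 - 740 together, the monitoring loop might be sketched as follows, reusing the hypothetical generate_audio_fingerprint and reference_map from the earlier sketches; exact-equality lookup is a simplification, since practical fingerprint comparison tolerates capture noise.

```python
# Sketch of the method-700 loop (operations 710-740), reusing the
# hypothetical helpers sketched earlier; exact-equality lookup is a
# simplification of real, noise-tolerant fingerprint matching.
def monitor(reference_map, capture_audio, present_notification):
    while True:
        samples = capture_audio()                               # microphone capture
        generated = tuple(generate_audio_fingerprint(samples))  # operation 720
        record = reference_map.get(generated)                   # operations 710/730
        if record is not None:                                  # event detected
            present_notification(record.notification)           # operation 740
```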
  • the method 700 may include one or more of operations 802 , 810 , 820 , 826 , and 828 .
  • Operation 802 may be performed prior to operation 710 , and operation 710 may be performed in response to operation 802 .
  • the detection module 340 of the device 131 receives a request that a current portion of the interactive media presentation 430 be identified.
  • the detection module 340 may detect that the help button 510 (e.g., labeled “Get Help For This Level”) has been activated (e.g., clicked or touched), where activation of the help button 510 initiates such a request to identify a current portion (e.g., a current level of a game) of the interactive media presentation 430 .
  • one or more of operations 710 , 720 , and 740 may be performed based on the received request.
  • the presenting of the notification 520 in operation 740 may identify the current portion of the interactive media presentation 430 (e.g., the current level of the game).
  • the current portion of the interactive media presentation 430 may be named in the notification 520 (e.g., within information 620 ) or shown in the notification 520 (e.g., within the map 610 ).
  • the media content 410 is unique within the interactive media presentation 430 , and the event 432 corresponds exclusively to the media content 410 .
  • occurrence of the event 432 is always accompanied by a playback of the media content 410 , and a playback of the media content 410 always signifies the occurrence of the event 432 .
  • the media content 410 may be a video (e.g., a “cut scene”) that is played only between Level 3 and Level 4 (e.g., upon completion of Level 3) in a multi-level computer game.
  • the media content 410 may be a sound (e.g., a trumpet fanfare) that is played only whenever a player is promoted to a higher rank in a military simulation game.
  • the detecting of the occurrence of the event 432 in operation 730 may be performed based simply on the second identifier (e.g., generated fingerprint) matching the first identifier (e.g., reference fingerprint).
  • the media content 410 is not unique within the interactive media presentation 430 .
  • the event 432 may correspond nonexclusively to the media content 410 .
  • a matching of the second identifier to the first identifier in operation 730 may be insufficient to detect the event 432 .
  • one or more additional comparisons of generated identifiers to reference identifiers may be used by the device 131 to detect the event 432 .
  • the reference module 310 of the device 131 accesses a third identifier in a manner similar to that described above with respect to operation 710 .
  • the third identifier may be a further reference identifier and may be accessed from the database 115 , and the third identifier may be obtained from the media content 412 , which is presentable as part of the interactive media presentation 430 .
  • the reference module 310 accesses a third fingerprint generated from the media content 412 (e.g., by the fingerprint generation module 220 of the reference server 110 ).
  • the reference module 310 may access a third watermark extracted from the media content 412 (e.g., by the watermark extraction module 230 of the reference server 110 ).
  • the fingerprint module 320 of the device 131 generates a fourth identifier (e.g., a fourth fingerprint) from a playback of the media content 412 as part of the interactive media presentation 430 .
  • the fourth identifier may be called a further generated identifier.
  • the watermark module 330 of the device 131 extracts the fourth identifier (e.g., a fourth watermark) from the playback of the media content 412 as part of the interactive media presentation 430 .
  • the detecting of the occurrence of the event 432 in operation 730 may be based on the third identifier (e.g., further reference identifier) matching the fourth identifier (e.g., further generated identifier).
  • the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third fingerprints (e.g., reference fingerprints) to generated second and fourth fingerprints (e.g., generated fingerprints). Based on the accessed first fingerprint matching the generated second fingerprint and the accessed third fingerprint matching the generated fourth fingerprint, the detection module 340 may detect the occurrence of the event 432 .
  • the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third watermarks (e.g., reference watermarks) to extracted second and fourth watermarks (e.g., extracted watermarks). Based on the accessed first watermark matching the extracted second watermark and the third watermark matching the fourth watermark, the detection module 340 may detect the occurrence of the event 432 .
  • the media content 410 is not unique within the interactive media presentation 430 , but the event 432 corresponds exclusively to the playback of the media content 410 being contemporaneous (e.g., within a five-second window) with the playback of the media content 412 (e.g., further media content).
  • the detecting of the occurrence of the event 432 in operation 730 may be based on the event 432 corresponding exclusively to the contemporaneous playback of the media content 410 with the media content 412 .
  • the event 432 may be detected based on the event 432 corresponding exclusively to the fact that the playback of the media content 410 is contemporaneous with the playback of the media content 412 .
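As an illustration of the contemporaneity test, the sketch below checks whether two playback detections fall within a configurable window; the five-second default mirrors the example above, and all names are assumptions.

```python
# Illustrative contemporaneity check: two playbacks count as contemporaneous
# when their detection timestamps fall within a configurable window.

def playbacks_contemporaneous(t_content_410: float, t_content_412: float,
                              window_s: float = 5.0) -> bool:
    """Return True when the two playbacks were detected within
    window_s seconds of each other."""
    return abs(t_content_410 - t_content_412) <= window_s

print(playbacks_contemporaneous(100.0, 103.5))  # True: within 5 seconds
print(playbacks_contemporaneous(100.0, 107.0))  # False: outside the window
```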
  • the media content 410 is not unique within the interactive media presentation 430 , but the event 432 corresponds nonexclusively to the playback of the media content 410 being contemporaneous (e.g., within a two-second window) with the playback of the media content 412 (e.g., further media content).
  • the detecting of the occurrence of the event 432 in operation 730 may be based on a probability that the playback of the media content 410 is contemporaneous with the playback of the media content 412 .
  • the detection module 340 of the device 131 accesses such a probability (e.g., from the network-based system 105 or any portion thereof).
  • the detection module 340 may access a 90% probability that the event 432 is accompanied by a contemporaneous playback of the media content 410 with the media content 412 . Accordingly, the detection module 340 may perform operation 730 based on the accessed probability.
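A minimal sketch of probability-weighted detection follows; the threshold value is a hypothetical choice, since the document specifies only that an accessed probability may inform operation 730.

```python
# Sketch of probability-weighted detection: when the media content is not
# unique, an accessed prior (e.g., 90%) can gate the detection decision.

DETECTION_THRESHOLD = 0.8  # hypothetical confidence cutoff

def detect_with_probability(contemporaneous_playback: bool, prior: float) -> bool:
    """Detect the event when the contemporaneous playback was observed and
    the accessed prior probability clears the threshold."""
    return contemporaneous_playback and prior >= DETECTION_THRESHOLD

print(detect_with_probability(True, 0.90))  # True: strong prior
print(detect_with_probability(True, 0.50))  # False: prior too weak
```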
  • the detecting of the occurrence of the event 432 in operation 730 may be based on a history of events that occurred within the interactive media presentation 430 prior to the playback of the media content 412 within the interactive media presentation 430 .
  • the detection module 340 of the device 131 accesses such a history of detected events (e.g., from the network-based system 105 or any portion thereof).
  • the detection module 340 may access a log of events previously detected by the device 131 while monitoring the interactive media presentation 430 , and the log may indicate that other events (e.g., aside from the event 432 ) signified by a playback of the media content 410 , the media content 412 , or both, have already occurred.
  • the other events that have already occurred may be eliminated as potential candidates for detection in operation 730 .
  • the detection module 340 may perform operation 730 based on the accessed history of detected events.
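One way to apply the accessed history is to subtract already-detected events from the candidate set before matching, as in this sketch; the data shapes are assumptions.

```python
# Sketch of eliminating already-detected events from the candidate set,
# using a per-presentation event log; the data shapes are illustrative.

def remaining_candidates(candidates, event_log):
    """Remove events that the log shows as having already occurred."""
    return set(candidates) - set(event_log)

candidates = {"event_432", "event_433", "event_434"}
event_log = ["event_433"]  # previously detected while monitoring
print(remaining_candidates(candidates, event_log))  # event_432 and event_434 remain
```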
  • the method 700 may include one or more of operations 840 , 841 , 842 , 843 , 844 , 845 , 846 , and 850 .
  • One or more of operations 840 - 846 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 740 , in which the presentation module 350 of the device 131 presents the notification 520 .
  • the event 432 includes (e.g., indicates or signifies) an achievement within a game by a player of the game (e.g., an in-game achievement by the user 132 ). Examples of such an achievement include completing a level of the game, performing a particular set of tasks within the game, gaining access to a feature of the game (e.g., discovering or unlocking hidden content or “Easter eggs”), or any suitable combination thereof.
  • the presentation module 350 presents information 630 , which may reference the achievement by the player.
  • the information 630 may include a congratulatory message that mentions the achievement.
  • the information 630 may include a suggestion that the user 132 (e.g., player) share the achievement with a socially connected friend (e.g., by sending a message that references the achievement).
  • the event 432 includes (e.g., indicates or signifies) an acquisition of a virtual item within a virtual world.
  • Examples of such an acquisition include obtaining a significant talisman or weapon (e.g., within a fantasy adventure game that is set within a three-dimensional virtual world), gaining access to an upgrade to an existing virtual item (e.g., a faster car in a racing game), receiving a large sum of virtual money (e.g., treasure or prizes), or any suitable combination thereof.
  • the presentation module 350 presents information 630 , which may reference the acquisition of the virtual item.
  • the information 630 may include a congratulatory message that mentions the acquisition.
  • the information 630 may include a suggestion that the user 132 (e.g., player) share news of the acquisition with a socially connected friend (e.g., by sending a message that references the acquisition).
  • the event 432 includes (e.g., indicates or signifies) a level of progress within a storyline of a game.
  • the media content 410 may include video content, audio content, or both, that indicates the level of progress (e.g., special music that is specific to a particular section of the storyline).
  • the presentation module 350 presents information 640 , which may be based on, and may reference, the level of progress within the storyline.
  • the information 640 may include a summary of the storyline up to the current level of progress, a preview of the next level of progress in the storyline, advice, tips, suggestions, encouragements, or any suitable combination thereof.
  • the event 432 includes (e.g., indicates or signifies) a level of advancement within the game by a player of the game (e.g., a new in-game rank attained by the user 132 ).
  • the media content 410 may include video content, audio content, or both, that indicates the level of advancement (e.g., special insignia presented on the screen or special sound effects).
  • the presentation module 350 presents information 640 , which may be based on, and may reference, the level of advancement within the game.
  • the information 640 may include a description of a rank to which the player has been promoted, a description of new abilities or powers accorded to the level of advancement, an indication of progress toward the next level of advancement, an indication of effort (e.g., measured in time, actions, or both) expended in reaching the level of advancement, or any suitable combination thereof.
  • the media content 410 includes content (e.g., audio content or video content) that indicates a virtual location within a virtual world.
  • the notification 520 may include information (e.g., information 650 ) that corresponds to a virtual location, virtual orientation, or both, within a virtual world.
  • the presentation module 350 of the device 131 presents information (e.g., information 650 ) that corresponds to (e.g., describes or references) the virtual location within the virtual world.
  • the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world.
  • the presentation module 350 presents information (e.g., information 650 ) that corresponds to the virtual orientation (e.g., at the virtual location) within the virtual world.
  • the notification 520 may include information (e.g., information 620 ) that contains a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof.
  • the presentation module 350 of the device 131 presents a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof, in performance of operation 740 .
  • the media content 410 indicates a failure to achieve a goal within the interactive media presentation 430 (e.g., a failure by a player of a videogame to achieve a goal within the game).
  • operation 845 may involve presenting a suggestion on achieving the goal (e.g., in the next attempt or some future attempt), an encouragement to the player (e.g., to the user 132 , that the user 132 try again to achieve the goal), an advertisement for a virtual item within the interactive media presentation 430 (e.g., a purchasable virtual item that may facilitate achieving the goal), or any suitable combination thereof.
  • the presentation module 350 of the device 131 communicates the notification 520 with the reference 660 , which may describe the user 132 .
  • the notification 520 may be communicated to the user 152 (e.g., via the device 150 ), who may be socially connected (e.g., as a friend, follower, or connection) to the user 132 via one or more social networking services. This may have the effect of notifying the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430 .
  • Operation 850 may be performed after operation 730 , in which the detection module 340 of the device 131 detects the occurrence of the event 432 . As shown in FIG. 9 , operation 850 may follow operation 740 , in which the presentation module 350 of the device 131 presents the notification 520 . In operation 850 , the detection module 340 stores a reference to the occurrence of the event 432 . The reference may be stored in a data record (e.g., within the database 115 ) that corresponds to the user 132 (e.g., a player of a videogame) from whom the user input 420 may be received.
  • the user input 420 may be a basis (e.g., an influence, a factor, or a control signal) for the interactive media presentation 430 .
  • the stored reference may be usable to identify a portion (e.g., section, chapter, level, or part) of the interactive media presentation 430 that contains the event 432 . That is, the stored reference may indicate, designate, or define the portion within which the event 432 is configured to occur (e.g., in response to the user input 420 ) within the interactive media presentation 430 . This may have the effect of annotating that the user 132 has been presented with this portion of the interactive media presentation 430 .
  • operation 850 results in the detection of the event 432 within the interactive media presentation 430 being recorded (e.g., in the database 115 ) in a game history of the user 132 .
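A compact sketch of operation 850 might look like the following; the record layout and the in-memory stand-in for the database 115 are assumptions for illustration.

```python
# Sketch of recording a detected event in a per-user game history
# (operation 850); the record layout is an illustrative assumption.

import time

game_history = {}  # in-memory stand-in for the database 115

def store_event_reference(user_id, event_id, portion):
    """Append a reference to the detected event, noting which portion of
    the interactive media presentation contains it."""
    game_history.setdefault(user_id, []).append(
        {"event": event_id, "portion": portion, "detected_at": time.time()}
    )

store_event_reference("user_132", "event_432", "level 3")
print(game_history["user_132"][0]["portion"])  # -> level 3
```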
  • one or more of the methodologies described herein may facilitate detection of an event within a presentation of interactive media. Moreover, one or more of the methodologies described herein may facilitate presentation of a notification that references an event within such interactive media. Hence, one or more of the methodologies described herein may facilitate provision of a supplementary information service that complements interactive media independently, with or without communication from a source of the interactive media.
  • one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in detecting events within interactive media presentations. Efforts expended by a user in identifying a current portion of an interactive media presentation may be reduced by one or more of the methodologies described herein. Computing resources used by one or more machines, databases, or devices (e.g., within the network environment 100 ) may similarly be reduced. Examples of such computing resources include processor cycles, network traffic, memory usage, data storage capacity, power consumption, and cooling capacity.
  • FIG. 10 is a block diagram illustrating components of a machine 1000 , according to some example embodiments, able to (e.g., configured to) read instructions from a machine-readable medium (e.g., a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part.
  • FIG. 10 shows a diagrammatic representation of the machine 1000 in the example form of a computer system and within which instructions 1024 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part.
  • the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment.
  • the machine 1000 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1024 , sequentially or otherwise, that specify actions to be taken by that machine.
  • the machine 1000 includes a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1004 , and a static memory 1006 , which are configured to communicate with each other via a bus 1008 .
  • the machine 1000 may further include a graphics display 1010 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)).
  • the machine 1000 may also include an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument, such as a brain-control interface (BCI) or any other sensor capable of recording human biometrics as input for controlling a cursor), a storage unit 1016 , a signal generation device 1018 (e.g., a speaker), and a network interface device 1020 .
  • the storage unit 1016 includes a machine-readable medium 1022 on which is stored the instructions 1024 embodying any one or more of the methodologies or functions described herein.
  • the instructions 1024 may also reside, completely or at least partially, within the main memory 1004 , within the processor 1002 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 1000 . Accordingly, the main memory 1004 and the processor 1002 may be considered as machine-readable media.
  • the instructions 1024 may be transmitted or received over a network 1026 (e.g., network 190 ) via the network interface device 1020 .
  • the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions.
  • the term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by a machine (e.g., machine 1000 ), such that the instructions, when executed by one or more processors of the machine (e.g., processor 1002 ), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a "machine-readable medium" refers to a single storage apparatus or device, as well as "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
  • Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
  • a “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
  • one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations described herein.
  • a hardware module may be implemented mechanically, electronically, or any suitable combination thereof.
  • a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
  • a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC.
  • a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
  • a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the phrase "hardware module" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
  • a "processor-implemented module" refers to a hardware module implemented using one or more processors.
  • the methods described herein may be at least partially processor-implemented, a processor being an example of hardware.
  • the operations of a method may be performed by one or more processors or processor-implemented modules.
  • the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS).
  • at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
  • the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
  • a method comprising:
  • a non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • a system comprising:
  • a non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:

Abstract

As a user is being presented with interactive media by a presenting device, a separate monitoring device may be used to monitor the presentation of the interactive media and detect an event that occurs therein. Such a monitoring device may be configured and positioned to access media content from the presentation of the interactive media. For example, the monitoring device may be configured and positioned to record video content with a camera and record audio content with a microphone. Having accessed this media content, the monitoring device may generate an identifier, such as a fingerprint or watermark, of the media content and compare the generated identifier with a reference identifier that is generated from the source of the media content. Based on the generated identifier matching the reference identifier, the monitoring device may detect that an event has occurred within the interactive media presentation and present a corresponding notification.

Description

TECHNICAL FIELD
The subject matter disclosed herein generally relates to the processing of data. Specifically, the present disclosure addresses systems and methods to facilitate detecting an event within interactive media.
BACKGROUND
Media (e.g., digital media) may be presented to one or more users, for example, for purposes of entertainment, merchandising (e.g., advertising), education, public service, or other communication of information. Some media may be characterized as being “interactive” where the user to whom the media is presented can interact with the media itself (e.g., its content). That is, the content of the media depends, at least partly, on input from the user. A computer game (e.g., videogame) is an example of interactive media in which a user (e.g., game player) has at least some control over the content of the computer game (e.g., video or audio associated with achieving or failing a goal, gaining or losing an item in the game, or the appearance of the user's character or avatar within the game). Software that emulates or simulates a musical instrument (e.g., a drum machine or piano tutorial) is another example of interactive media, since the user fully or partly controls the audio content (e.g., sounds or music) produced by the software.
Generally, simply allowing a user to control a playing of media that cannot be changed by the user would not be considered “interactive.” For example, a movie may be presented to a user (e.g., movie viewer) via a player device (e.g., a portable media player) or player software (e.g., executing on a computer). Although the player device or software may allow the user to play the movie, pause the movie, skip forward within the movie, and skip backward within the movie, the movie itself (e.g., the content of the movie) is unchangeable by the viewer. Hence, playing the movie itself is not an interactive experience for the viewer.
In some situations, an interactive experience includes the playing or triggering of non-interactive media. For example, a video “cut scene” may be automatically played between levels of a computer game. In such an example, the computer game is an interactive experience, for at least the reason that the player controls when one level is completed, thereby controlling when the video “cut scene” is played, even though the video “cut scene” itself is non-interactive.
BRIEF DESCRIPTION OF THE DRAWINGS
Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
FIG. 1 is a network diagram illustrating a network environment suitable for detecting an event within interactive media, according to some example embodiments.
FIG. 2 is a block diagram illustrating components of a reference server configured to facilitate detection of an event within interactive media, according to some example embodiments.
FIG. 3 is a block diagram illustrating components of a device configured to detect an event within interactive media, according to some example embodiments.
FIG. 4 is a conceptual diagram illustrating detection of an event within an interactive media presentation, according to some example embodiments.
FIG. 5 is a layout diagram illustrating a notification that may be displayed with an interactive media presentation, according to some example embodiments.
FIG. 6 is a layout diagram illustrating the notification, according to some example embodiments.
FIGS. 7-9 are flowcharts illustrating operations of a device in performing a method of detecting an event within interactive media, according to some example embodiments.
FIG. 10 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
DETAILED DESCRIPTION
Example methods and systems are directed to detection of one or more events within interactive media (e.g., within a presentation of interactive media). Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
A device (e.g., a computer, game console, or mobile device) may be used to present interactive media (e.g., a game, such as a videogame) to a user of the device (e.g., a player of the game). The interactive media may be stored on the device (e.g., in local memory or other storage) and presented by the device (e.g., by executing a software application, applet, or app). In other situations, the interactive media may be stored by a server (e.g., a game server) and provided to the device by the server (e.g., streamed live, downloaded portion by portion, or downloaded in full) for presentation by the device. The interactive media may include media files that each store media content (e.g., video content, image content, or audio content), and a presentation of the interactive media may be generated, presented, or both, by the device based on user input that influences or controls which media files are included in the presentation. That is, the user input may fully or partially determine whether and when a particular media file is included in the presentation.
As the user is being presented with the interactive media by a presenting device (e.g., first device), a monitoring device (e.g., second device) may be used to monitor the presentation of the interactive media and detect an event that occurs therein. Such a monitoring device may be configured and positioned to access media content from the presentation of the interactive media. For example, the monitoring device may be configured and positioned to record video content (e.g., one or more video frames, which may be still images) with a camera and record audio content with a microphone. Having accessed this media content, the monitoring device may generate an identifier, such as a fingerprint or watermark, of the media content and compare the generated identifier with a reference identifier that is generated from the source of the media content. Based on the generated identifier matching the reference identifier, the monitoring device may detect that an event has occurred within the interactive media presentation and present a corresponding notification.
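To make this capture-fingerprint-compare loop concrete, here is a minimal sketch; the hash-based fingerprint is a deliberate simplification (a real monitoring device would use a perceptual, noise-robust fingerprint), and all names are assumptions.

```python
# Minimal sketch of the monitoring loop: capture content, derive an
# identifier, and compare it against reference identifiers.

import hashlib

def fingerprint(samples: bytes) -> str:
    """Stand-in fingerprint: a hash of the captured samples. A real system
    would use a perceptual fingerprint robust to noise and distortion."""
    return hashlib.sha256(samples).hexdigest()

def monitor(captured_frames, reference_index):
    """Yield the notification for each captured frame whose fingerprint
    matches a reference identifier."""
    for samples in captured_frames:
        fp = fingerprint(samples)
        if fp in reference_index:
            yield reference_index[fp]

# Example with synthetic "captured" data:
reference_index = {fingerprint(b"cut-scene-audio"): "Level complete!"}
for notification in monitor([b"ambient-noise", b"cut-scene-audio"], reference_index):
    print(notification)  # -> Level complete!
```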
Accordingly, the monitoring device may present a notification that references the occurrence of the detected event. Such a notification may be presented to the user (e.g., via the monitoring device, the presenting device, or both). The notification may be presented to another user (e.g., a socially connected friend, follower, or connection of the user, as identified by or according to a social networking system). In some example embodiments, the notification is presented by the monitoring device. However, in alternative example embodiments, the monitoring device may cause the presenting device (e.g., the device that presents the interactive media) to present the notification. In some example embodiments, the presenting device and the monitoring device are combined into a single device.
Moreover, the monitoring device may function entirely independently of any server or other source that may be providing the interactive media presentation to the presenting device. That is, the monitoring device may detect an event within the interactive media presentation and present a corresponding notification without communication from such a server or other source of the interactive media presentation. Further details are described below.
FIG. 1 is a network diagram illustrating a network environment 100 suitable for detecting an event within interactive media, according to some example embodiments. The network environment 100 includes a reference server 110, a database 115, a social network server 118, an interactive media presentation server 120, and devices 130, 131, and 150, all of which may be communicatively coupled to each other via a network 190. In some example embodiments, the interactive media presentation server 120 is communicatively coupled to the device 130 by a separate network or other communication path. The reference server 110, the database 115, the social network server 118, the interactive media presentation server 120, and the devices 130, 131, and 150 may each be implemented in a computer system, in whole or in part, as described below with respect to FIG. 10. As shown in FIG. 1, the reference server 110, with or without the database 115, may form all or part of a network-based system 105. For example, the network-based system 105 may be or include a cloud-based system that provides one or more network-based services (e.g., provision of reference identifiers for media content included as part of various interactive media).
Also shown in FIG. 1 are users 132 and 152. One or both of the users 132 and 152 may be a human user (e.g., a human being), a machine user (e.g., a computer configured by a software program to interact with the device 130), or any suitable combination thereof (e.g., a human assisted by a machine or a machine supervised by a human). The user 132 is not part of the network environment 100, but is associated with the device 130 and may be a user of the device 130. For example, the device 130 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132.
The user 132 may also be associated with the device 131 and may be a user of the device 131. For example, the device 131 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132. As shown in FIG. 1, the device 131 is able to monitor (e.g., by accessing or receiving) media content presented as part of an interactive media presentation by the device 130. In certain example embodiments, the device 131 and the device 130 are combined into a single device. In such example embodiments, the monitoring of the media content may be performed internally by the single device (e.g., within memory).
Likewise, the user 152 is not part of the network environment 100, but is associated with the device 150. According to various example embodiments, the user 152 is a socially connected friend, follower, or connection of the user 132 (e.g., as identified or indicated by a social networking service, such as Facebook® or Twitter®). As an example, the device 150 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 152.
Any of the machines, databases, or devices shown in FIG. 1 may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine, database, or device. For example, a computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 10. As used herein, a “database” is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database (e.g., an object-relational database), a triple store, a hierarchical data store, or any suitable combination thereof. Moreover, any two or more of the machines, databases, or devices illustrated in FIG. 1 may be combined into a single machine, and the functions described herein for any single machine, database, or device may be subdivided among multiple machines, databases, or devices.
The network 190 may be any network that enables communication between or among machines, databases, and devices (e.g., the reference server 110 and the device 131). Accordingly, the network 190 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 190 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
FIG. 2 is a block diagram illustrating components of the reference server 110, which may be configured to facilitate detection of an event within interactive media, according to some example embodiments. The reference server 110 may be a machine that, as shown, includes a media access module 210, a fingerprint generation module 220, a watermark extraction module 230, and a notification correlation module 240, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
Any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any module described herein may configure a processor to perform the operations described herein for that module. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
The media access module 210 of the reference server 110 is configured to access media (e.g., from the database 115, from the interactive media presentation server 120, or from both). The accessed media may include one or more media files containing media content that may be presentable as part of the interactive media presentation. For example, the interactive media presentation server 120 may store such media files, and the media access module 210 of the reference server 110 may access or retrieve those media files. The database 115 may be used by the media access module 210 to temporarily or permanently store the media files (e.g., for fingerprint generation, watermark extraction, or both).
The fingerprint generation module 220 of the reference server 110 is configured to generate reference fingerprints from the media files that are accessed by the media access module 210. For example, the fingerprint generation module 220 may apply one or more algorithms to a video file and generate a reference fingerprint that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the fingerprint generation module 220 may apply one or more algorithms to an audio file and thus generate a reference fingerprint usable to identify a playing of that audio file within the interactive media presentation. As a further example, the fingerprint generation module 220 may apply one or more algorithms to an image file and accordingly generate a reference fingerprint usable to identify a displaying of that image file within the interactive media presentation. As a yet further example, the fingerprint generation module 220 may apply one or more algorithms to a text file and thereby generate a reference fingerprint that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
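As one hypothetical example of such an algorithm for audio, the sketch below fingerprints a signal by the dominant FFT bin per frame; the patent does not prescribe any particular algorithm, so this scheme is purely illustrative.

```python
# Sketch of a coarse audio fingerprint: the index of the dominant frequency
# bin per frame. Illustrative only; real fingerprints are far more robust.

import numpy as np

def audio_reference_fingerprint(samples, frame=1024):
    """Return, per frame, the index of the strongest FFT bin as a compact
    identifier of the audio content."""
    peaks = []
    for i in range(len(samples) // frame):
        spectrum = np.abs(np.fft.rfft(samples[i * frame:(i + 1) * frame]))
        peaks.append(int(np.argmax(spectrum[1:]) + 1))  # skip the DC bin
    return peaks

# A 440 Hz tone sampled at 44.1 kHz maps to a stable peak bin (about 10).
t = np.arange(44100) / 44100.0
print(audio_reference_fingerprint(np.sin(2 * np.pi * 440 * t))[:3])
```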
The watermark extraction module 230 of the reference server 110 is configured to extract reference watermarks from the media files that are accessed by the media access module 210. For example, the watermark extraction module 230 may apply one or more algorithms to a video file and extract a reference watermark that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the watermark extraction module 230 may apply one or more algorithms to an audio file and thus extract a reference watermark usable to identify a playing of that audio file within the interactive media presentation. As a further example, the watermark extraction module 230 may apply one or more algorithms to an image file and accordingly extract a reference watermark usable to identify a displaying of that image file within the interactive media presentation. As a yet further example, the watermark extraction module 230 may apply one or more algorithms to a text file and thereby extract a reference watermark that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
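For comparison, here is a sketch of one simple watermark-extraction technique (least-significant-bit recovery from pixel values); LSB is an illustrative stand-in, since the document does not specify a watermarking scheme.

```python
# Sketch of least-significant-bit watermark extraction from image pixels.
# LSB is an illustrative stand-in for whatever scheme a deployment uses.

def extract_lsb_watermark(pixels, n_bits):
    """Read the low bit of the first n_bits pixel values and pack them
    into a bit string usable as a reference watermark."""
    return "".join(str(p & 1) for p in pixels[:n_bits])

# The low bits of these pixel values spell out the watermark "1011".
print(extract_lsb_watermark([255, 128, 13, 77], 4))  # -> 1011
```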
According to various example embodiments, one or both of the fingerprint generation module 220 and the watermark extraction module 230 may be included in the reference server 110. Hence, the reference server 110 may form all or part of a cloud-based server system (e.g., of one or more machines) that is configured to generate fingerprints from various media content, extract watermarks from various media content, or any suitable combination thereof.
The notification correlation module 240 of the reference server 110 is configured to correlate a media file (e.g., accessed by the media access module 210 and processed by the fingerprint generation module 220, the watermark extraction module 230, or both) with a notification that references an event which may occur within the interactive media presentation. For example, supposing the interactive media presentation is a videogame, the media file may contain video content that shows an in-game character congratulating the user (e.g., game player) on completing a difficult level of the game. In such a case, the completion of the difficult level of the game is the event that may occur within the interactive presentation, and this event, this media file, or both may be correlated with a notification that references the completion of this level of the game. The notification correlation module 240 may access event data that correlates the event with the media file (e.g., from the interactive media presentation server, from the database 115, or from both). Based on such event data, the notification correlation module 240 may map the event, the media file, or both, to the corresponding notification, which may be stored in the database 115 (e.g., after being automatically or manually generated based on the media file). This correspondence relationship (e.g., map) may be stored in the database 115.
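The correlation the module maintains can be pictured as a mapping from a reference identifier to the event it signifies and the notification to present, as in this sketch; the dictionary layout is an assumption.

```python
# Sketch of the event-to-notification correlation (e.g., stored in the
# database 115); the dictionary layout is an illustrative assumption.

correlation_map = {
    "fp-cutscene-level3": {
        "event": "completed level 3",
        "notification": "Congratulations on finishing Level 3!",
    },
}

def notification_for(reference_id):
    """Look up the notification mapped to a reference identifier, if any."""
    entry = correlation_map.get(reference_id)
    return entry["notification"] if entry else None

print(notification_for("fp-cutscene-level3"))
```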
Thus, the reference server 110, the database 115, or both, may be configured to provide (e.g., to any one or more of devices 130, 131, and 150) a reference identifier (e.g., fingerprint or watermark) of the media file and a notification that corresponds to an event signified by the media file being presented within the interactive media presentation. The reference identifier and the notification may be provided as part of a network-based service that supplements the interactive media presentation with additional information (e.g., the notification) upon detection of the event occurring. In some example embodiments, the network-based system 105 provides such a service.
Moreover, such a service may be provided without any cooperation, assistance, or other communication from the interactive media presentation server 120 or other source of the interactive media presentation. Indeed, some or all of the network-based system 105 may obtain and provide reference identifiers of various media files and the media content thereof, by accessing such media content (e.g., as a user) during a presentation of the interactive media presentation. For example, reference identifiers may be obtained from playing a computer game (e.g., to completion, automatically, by executing software scripts to simulate user input), and corresponding notifications may be generated (e.g., automatically or manually) and mapped to the obtained reference identifiers. As a result, the network-based system 105 may provide a supplemental information service that complements the interactive media presentation, but is separate from the interactive media presentation and produced independently (e.g., without collaboration with the author or source of the interactive media presentation).
FIG. 3 is a block diagram illustrating components of the device 131, which may be configured to detect an event within interactive media, according to some example embodiments. The device 131 may be a machine that, as shown, includes a reference module 310, a fingerprint module 320, a watermark module 330, a detection module 340, and a presentation module 350, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
As noted above, any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any module described herein may configure a processor to perform the operations described herein for that module. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
As noted above, the device 131 may monitor an interactive media presentation being presented by the device 130 (e.g., by recording or otherwise accessing presented media content included in the interactive media presentation). Media content being presented as part of the interactive media presentation may thus be accessed by the device 131 (e.g., for detection of the event that corresponds to the media content). For example, the device 131 may include a camera that is configured to capture video content, image content, text content, or any suitable combination thereof, that appears in the interactive media presentation. As another example, the device 131 may include a microphone that is configured to capture audio content that is played within the interactive media presentation.
The reference module 310 of the device 131 is configured to access a reference identifier (e.g., reference fingerprint or reference watermark) that is generated or extracted (e.g., by the reference server 110) from media content which is presentable as part of the interactive media presentation. The reference identifier may be accessed from the network-based system 105 (e.g., from the reference server 110 or from the database 115). The reference identifier may be denoted herein as a “first identifier.” As noted above, within the interactive media presentation, an event that corresponds to the media content may be configured to occur in response to a user input (e.g., generated by the user 132 and submitted via the device 130, as the user 132 is interacting with the interactive media presentation).
The fingerprint module 320 of the device 131 is configured to generate a fingerprint from a playback of the media content as part of the interactive media presentation. For example, the fingerprint module 320 may apply one or more algorithms to a video file and generate a fingerprint that may be compared to a reference fingerprint for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the fingerprint module 320 may apply one or more algorithms to an audio file and thus generate a fingerprint that may be compared to a reference fingerprint for that audio file and thereby identify a playing of that audio file within the interactive media presentation. As a further example, the fingerprint module 320 may apply one or more algorithms to an image file and accordingly generate a fingerprint that may be compared to a reference fingerprint for that image file and thereby identify a displaying of that image file within the interactive media presentation. As a yet further example, the fingerprint module 320 may apply one or more algorithms to a text file and thereby generate a fingerprint that may be compared to a reference fingerprint for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
The watermark module 330 of the device 131 is configured to extract a watermark from the playback of the media content as part of the interactive media presentation. For example, the watermark module 330 may apply one or more algorithms to a video file and extract a watermark that may be compared to a reference watermark for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the watermark module 330 may apply one or more algorithms to an audio file and thus extract a watermark that may be compared to a reference watermark for that audio file and thereby identify a playing of that audio file within the interactive media presentation. As a further example, the watermark module 330 may apply one or more algorithms to an image file and accordingly extract a watermark that may be compared to a reference watermark for that image file and thereby identify a displaying of that image file within the interactive media presentation. As a yet further example, the watermark module 330 may apply one or more algorithms to a text file and thereby extract a watermark that may be compared to a reference watermark for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
According to various example embodiments, one or both of the fingerprint module 320 and the watermark module 330 may be included in the device 131. Hence, the device 131 may be configured to generate fingerprints from various media content monitored by the device 131, extract watermarks for such media content, or any suitable combination thereof.
The detection module 340 of the device 131 is configured to detect an occurrence of the event that corresponds to the media content. As noted above, the event may be configured to occur in response to a user input (e.g., from the user 132). The detection of this occurrence of the event may be based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131. For example, the occurrence of the event may be detected based on a match between a fingerprint generated by the fingerprint module 320 and a reference fingerprint accessed by the reference module 310. As another example, the occurrence of the event may be detected based on a match between a watermark extracted by the watermark module 330 and a reference watermark accessed by the reference module 310.
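Because a microphone or camera rarely reproduces a reference bit-for-bit, matching is usually tolerant rather than exact. The sketch below uses a Hamming-distance threshold over bit strings; the threshold value is an assumption.

```python
# Sketch of tolerant identifier matching via a Hamming-distance threshold
# over bit strings; the threshold value is an illustrative assumption.

def hamming(a, b):
    """Count differing positions between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def identifiers_match(generated, reference, max_distance=3):
    """Treat the identifiers as matching when they have equal length and
    differ in at most max_distance bit positions."""
    return len(generated) == len(reference) and \
        hamming(generated, reference) <= max_distance

print(identifiers_match("10110110", "10110111"))  # True: one bit differs
print(identifiers_match("10110110", "01001001"))  # False: every bit differs
```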
The presentation module 350 of the device 131 is configured to present a notification (e.g., accessed from the network-based system 105) that references the occurrence of the event. The notification may be presented based on the detecting of the event's occurrence, based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131, or based on both.
FIG. 4 is a conceptual diagram illustrating detection of an event 432 within an interactive media presentation 430, according to some example embodiments. As shown, the interactive media presentation 430 is generated, presented, or both (e.g., by the device 130) based on user input 420 and based on media 405. The user input 420 may be received from the device 130 that is presenting the interactive media presentation 430. For example, the interactive media presentation 430 may be a game (e.g., multimedia game) presented by game software that is executing on the device 130, and the user input 420 may be or include control signals, commands, or choices generated by the user 132 as part of playing the game.
As shown in FIG. 4, the media 405 may take the form of media files that contain media content 410, 412, 414, and 416. Examples of media content (e.g., media content 410) include video, an image, audio, text, or any suitable combination thereof. The arrows pointing to the interactive media presentation 430 from the user input 420 and from the media 405 indicate that the interactive media presentation 430 is influenced, at least in part, by the user input 420 and the media 405.
In the example illustrated in FIG. 4, the event 432 occurs within the interactive media presentation 430 (e.g., as a result of the user input 420). Because the event 432 is occurring, the interactive media presentation 430 includes (e.g., incorporates) the media content 410, which signifies the occurrence of the event 432. As a result, a playback of the media content 410 is initiated by the interactive media presentation 430 (e.g., via the device 130).
Since the device 131 is monitoring the interactive media presentation 430 and the playback of various media content included therein, a detection 440 of the event 432 that corresponds to the media content 410 may be performed (e.g., by the device 131). For example, the device 131 may be configured by event detection software that executes on the device 131 and performs the detection 440. According to various example embodiments, a method discussed below with respect to FIGS. 7-9 may be implemented by such event detection software.
FIG. 5 is a layout diagram illustrating a notification 520 that may be displayed with (e.g., within) an interactive media presentation, according to some example embodiments. In FIG. 5, a user interface 500 is depicted in the example form of a graphical window. Such a graphical window may be displayed on a display screen of the device 130, while the device 130 is presenting the interactive media presentation 430 (e.g., within the user interface 500). In some example embodiments, the interactive media presentation 430 is a game (e.g., titled “Majestic Fantasy 4: The Unkempt Realms”), and the user interface 500 is used to present various media content of the game (e.g., media content 410).
In response to the detection 440 of the event 432 that corresponds to the media content 410, the notification 520 may be presented. FIG. 5 depicts the notification 520 being presented within the user interface 500. In alternative example embodiments, the notification 520 may be presented outside the user interface 500 (e.g., elsewhere on a display screen of the device 130, on a display screen of the device 131, or on a display screen of the device 150). The notification 520 references the event 432 (e.g., the occurrence of the event 432 within the interactive media presentation 430).
As shown in FIG. 5, some example embodiments of the user interface 500 include a help button 510 (e.g., labeled “Get Help for This Level”). In some example embodiments, the help button 510 appears in the user interface 500 in response to the detection 440 of the event 432. For example, the event 432 may be an in-game defeat of the user 132 (e.g., player), and the detection 440 of such a defeat may cause the help button 510 to appear within the user interface 500. If the user 132 clicks on the help button 510, the notification 520 appears within the user interface 500 (e.g., to provide information that may be helpful to avoid another such defeat). As another example, the event 432 may be an in-game promotion of the user 132 to a more difficult level of the game, and the detection 440 of such a promotion may trigger the appearance of the help button 510. In response to the user 132 clicking on the help button 510, the notification 520 may be shown (e.g., to provide strategy for playing the more difficult level of the game).
FIG. 6 is a layout diagram illustrating the notification 520, according to some example embodiments. As noted above, the notification 520 references the event 432 within the interactive media presentation 430 (e.g., references the occurrence of the event 432), and according to various example embodiments, the notification 520 may be presented to one or more of the users 132 and 152 (e.g., via one or more of the devices 130, 131, and 150).
In general, the notification 520 may include any information that is pertinent to the occurrence of the event 432 within the interactive media presentation 430. As shown in FIG. 6, the notification 520 may include a map 610 (e.g., an in-game map of a player's current level within a game). The notification 520 may include one or more pieces of information 620, 630, 640, and 650, which each may reference the occurrence of the event 432. Examples of such information include a help document, a guide, an encouragement (e.g., to a player of a game, that the player persevere in attempting to win a difficult section or level of the game), a suggestion (e.g., that the player of the game attempt a different strategy), or an advertisement (e.g., that the player purchase a virtual in-game item that may enhance the player's enjoyment of the current section or level of the game). Additional examples of such information include an offer for the purchase of one or more virtual goods (e.g., as an in-game purchase, an in-app purchase, downloadable level content (DLC), or any suitable combination thereof), as well as some or all of a user interface (e.g., an electronic storefront) operable to initiate such a purchase. Further examples of such information include an offer for the purchase of one or more physical goods (e.g., related or recommended merchandise, games, memorabilia, soundtracks, or other physical items), as well as some or all of a user interface operable to initiate such a purchase. In some example embodiments, the notification 520 includes a hyperlink to such information.
As noted in FIG. 6, some or all of the information 620, 630, 640, or 650 may refer to an achievement that is signified by the event 432 occurring within the interactive media presentation 430 (e.g., completion of a level in a game, as signified by a video “cut scene” that appears at the end of the level). Any of the information 620, 630, 640, or 650 may describe a virtual item in a virtual world (e.g., an upgraded sword within a fantasy adventure game, as signified by special music that plays upon acquisition of the upgraded sword).
Some or all of the information 620, 630, 640, or 650 may be included in the notification 520 based on a level of progress within a storyline of the interactive media presentation (e.g., a plot of a game). Similarly, any of the information 620, 630, 640, or 650 may be included based on a level of advancement attained by the user 132 (e.g., within a character arc of an in-game character or avatar of the user 132). Accordingly, any of the information 620, 630, 640, or 650 may refer to such a level of progress or level of advancement.
In some example embodiments, some or all of the information 620, 630, 640, or 650 corresponds to a virtual location, virtual orientation, or both, within a virtual world. For example, suppose the interactive media presentation 430 is an immersive game within a three-dimensional virtual world. The event 432 may be an arrival of the player at a particular location within the virtual world (e.g., a waterfall that hides a treasure chest), and the event 432 may be detected by the corresponding playback of the media content 410 (e.g., a particular audio pattern of water splashing sounds, a particular visual pattern of rock formations, or both). Based on this, any of the information 620, 630, 640, or 650 may reference that particular location (e.g., the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Look behind the waterfall to find treasure!” or “If you want to search behind the waterfall, would you like to buy some goggles or an umbrella?”).
In certain situations, the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world. Accordingly, the event 432 may be detected by a corresponding playback of the media content 410 with a particular virtual orientation (e.g., a particular audio pattern of water splashing sounds whose frequency distribution indicates that the player is facing towards the waterfall that hides the treasure chest, a particular visual pattern of rock formations indicating that the player is facing the waterfall, or both). Based on this, any of the information 620, 630, 640, or 650 may reference that particular orientation (e.g., facing towards the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Don't get distracted by the waterfall! Enemies may be lurking behind you!” or “If you want to search behind the waterfall, would you like to hire a helper to watch your back?”).
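The description above leaves open how a monitoring device would infer a virtual orientation from spatialized audio. As a rough, non-authoritative illustration in Python, the left/right channel balance of a captured stereo buffer can hint at where a known source (e.g., the waterfall) sits relative to the player; the function names, the two-channel assumption, and the tolerance are all invented for this sketch.

```python
# Hypothetical sketch: inferring a coarse virtual orientation from the
# channel balance of spatialized stereo audio. This is NOT the patented
# method; every name and threshold here is an assumption.
import numpy as np

def channel_energy(samples: np.ndarray) -> np.ndarray:
    """RMS energy per channel; `samples` has shape (channels, num_samples)."""
    return np.sqrt(np.mean(samples.astype(np.float64) ** 2, axis=1))

def estimate_facing(samples: np.ndarray, tolerance: float = 0.1) -> str:
    """Coarsely place a known source (e.g., the waterfall) relative to the player.

    A near-equal left/right balance suggests the source is roughly ahead
    (or directly behind); a strong imbalance puts it to one side.
    """
    left, right = channel_energy(samples)[:2]
    balance = float(right - left) / max(float(left + right), 1e-12)
    if abs(balance) <= tolerance:
        return "source roughly ahead (or behind)"
    return "source to the right" if balance > 0 else "source to the left"

# Example: a source panned hard right implies the player is not facing it.
t = np.linspace(0.0, 1.0, 44100, endpoint=False)
tone = np.sin(2.0 * np.pi * 440.0 * t)
print(estimate_facing(np.stack([0.2 * tone, 0.9 * tone])))  # source to the right
```

A fuller treatment would examine the frequency distribution mentioned above (e.g., spectral cues from head-related transfer functions) rather than raw channel energy, but the channel-balance heuristic is enough to show the idea.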
In example embodiments where the notification 520 is presented to the user 152 via the device 150, the notification 520 may include a reference 660 to the user 132. As noted above, the users 132 and 152 may be socially connected to each other (e.g., as friends, followers, or connections, such as may be indicated by a social networking service). Accordingly, the notification 520 with the reference 660 may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430. For example, the notification 520 may thus tell the user 152 that the user 132 has been defeated in playing a game, has been promoted to a more difficult level of the game, has acquired a virtual object within the game, has advanced to a particular point in the game's storyline, or has arrived at a particular virtual location within a virtual world in which the game is played.
FIGS. 7-9 are flowcharts illustrating operations of the device 131 in performing a method 700 of detecting the event 432 within the interactive media presentation 430, according to some example embodiments. Operations in the method 700 may be performed by the device 131 (e.g., separately from the device 130, or sharing one or more operations with the device 130), using modules described above with respect to FIG. 3. As shown in FIG. 7, the method 700 includes operations 710, 720, 730, and 740.
In operation 710, the reference module 310 of the device 131 accesses a first identifier (e.g., a first fingerprint or a first watermark). The first identifier may be a reference identifier and may be accessed from the database 115, and the first identifier may be obtained from the media content 410, which is presentable as part of the interactive media presentation 430. In some example embodiments, the reference module 310 accesses a first fingerprint generated from the media content 410 (e.g., by the fingerprint generation module 220 of the reference server 110). In certain example embodiments, the reference module 310 may access a first watermark extracted from the media content 410 (e.g., by the watermark extraction module 230 of the reference server 110).
In operation 720, according to some example embodiments, the fingerprint module 320 of the device 131 generates a second identifier (e.g., a second fingerprint) from a playback of the media content 410 as part of the interactive media presentation 430. The second identifier may be called a generated identifier. In certain example embodiments, the watermark module 330 of the device 131 extracts the second identifier (e.g., a second watermark) from the playback of the media content 410 as part of the interactive media presentation 430.
In operation 730, the detection module 340 of the device 131 detects an occurrence of the event 432 within the interactive media presentation 430. This detection may be based on the second identifier (e.g., generated identifier) matching the first identifier (e.g., reference identifier). For example, the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first fingerprint (e.g., reference fingerprint) of the media content 410 to a generated second fingerprint (e.g., generated fingerprint) of the media content 410. Based on the accessed first fingerprint matching the generated second fingerprint, the detection module 340 may detect the occurrence of the event 432. As another example, the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first watermark (e.g., reference watermark) of the media content 410 to an extracted second watermark (e.g., extracted watermark) of the media content 410. Based on the accessed first watermark matching the extracted second watermark, the detection module 340 may detect the occurrence of the event 432.
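The patent does not disclose a particular fingerprinting algorithm, so the following Python sketch is purely illustrative of operations 710-730: it derives a Haitsma-Kalker-style band-energy bit signature from an audio buffer and declares a match when the bit-error rate against the reference falls below a tolerance. The function names, frame size, band count, and threshold are all assumptions.

```python
# Hypothetical sketch of operations 710-730: derive a compact fingerprint
# from an audio buffer and compare it against a stored reference fingerprint.
import numpy as np

def fingerprint(samples: np.ndarray, frame: int = 2048, bands: int = 16) -> np.ndarray:
    """Binary signature: bit is 1 where a band's energy rises between frames."""
    window = np.hanning(frame)
    num_frames = len(samples) // frame
    energies = np.empty((num_frames, bands))
    for i in range(num_frames):
        spectrum = np.abs(np.fft.rfft(samples[i * frame:(i + 1) * frame] * window))
        energies[i] = [band.sum() for band in np.array_split(spectrum, bands)]
    return (energies[1:] > energies[:-1]).astype(np.uint8)

def matches(reference: np.ndarray, candidate: np.ndarray,
            max_bit_error_rate: float = 0.25) -> bool:
    """Analogue of the detection step: match when the bit-error rate is low."""
    n = min(len(reference), len(candidate))
    if n == 0:
        return False
    errors = np.count_nonzero(reference[:n] != candidate[:n])
    return errors / reference[:n].size <= max_bit_error_rate
```

A detection module along these lines would trigger the notification 520 whenever `matches(reference_fp, generated_fp)` returns `True`.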
In operation 740, the presentation module 350 of the device 131 presents the notification 520, which may reference the occurrence of the event 432 within the interactive media presentation 430. As noted above, the event 432, the occurrence thereof, or both, may be detected based on the second identifier being determined to match the first identifier (e.g., in operation 730). For example, the presentation module 350 may present the notification 520 on a display screen of the device 131 (e.g., within an alert or other message). Accordingly, the notification 520 may be presented to the user 132 on the device 131, which may be functioning as a supplementary device that monitors the interactive media presentation 430 and provides notifications (e.g., notification 520) in response to detected events (e.g., event 432) therein.
In some example embodiments, the presentation module 350 of the device 131 causes the device 130 to present the notification 520 (e.g., within the user interface 500, in which the interactive media presentation 430 may be presented). Accordingly, the notification 520 may be presented to the user 132 on the device 130, which may be both presenting the interactive media presentation 430 and providing notifications (e.g., notification 520) in response to detected events (e.g., event 432) therein.
In certain example embodiments, the presentation module 350 causes the device 150 to present the notification 520 (e.g., within an alert or other message). For example, the notification 520 may be presented with the reference 660 (e.g., to the user 132). Accordingly, the notification 520 may be presented to the user 152 on the device 150, which may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430, as indicated by detection of the playback of the media content 410 as part of the interactive media presentation 430.
As shown in FIG. 8, the method 700 may include one or more of operations 802, 810, 820, 826, and 828. Operation 802 may be performed prior to operation 710, and operation 710 may be performed in response to operation 802. In operation 802, the detection module 340 of the device 131 receives a request that a current portion of the interactive media presentation 430 be identified. For example, the detection module 340 may detect that the help button 510 (e.g., labeled “Get Help For This Level”) has been activated (e.g., clicked or touched), where activation of the help button 510 initiates such a request to identify a current portion (e.g., a current level of a game) of the interactive media presentation 430. In example embodiments that include operation 802, one or more of operations 710, 720, and 740 may be performed based on the received request. Moreover, the presenting of the notification 520 in operation 740 may identify the current portion of the interactive media presentation 430 (e.g., the current level of the game). For example, the current portion of the interactive media presentation 430 may be named in the notification 520 (e.g., within information 620) or shown in the notification 520 (e.g., within the map 610).
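As a hedged sketch of this request-driven flow, reusing the hypothetical `fingerprint` and `matches` helpers from the earlier sketch, the help button's handler might capture recent playback audio, identify the current portion against stored reference entries, and present the result. The `ReferenceEntry` layout and all names are assumptions rather than the patented design.

```python
# Hypothetical wiring for operation 802: activating the help button issues a
# request to identify the current portion, which drives fingerprint capture,
# lookup, and the notification. `fingerprint` and `matches` are the
# illustrative helpers sketched earlier; nothing here is prescribed by the patent.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

import numpy as np

@dataclass
class ReferenceEntry:
    fingerprint: np.ndarray   # reference identifier, as in operation 710
    portion: str              # e.g., "Level 3"
    event_id: str

def on_help_button(captured_audio: np.ndarray,
                   reference_db: Iterable[ReferenceEntry],
                   present: Callable[[str], None]) -> Optional[str]:
    """Handle 'Get Help for This Level': identify the portion, then notify."""
    generated = fingerprint(captured_audio)              # operation 720
    for entry in reference_db:                           # operations 710 and 730
        if matches(entry.fingerprint, generated):
            present(f"You are playing {entry.portion}.")  # operation 740
            return entry.event_id
    present("The current portion could not be identified.")
    return None
```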
In some example embodiments, the media content 410 is unique within the interactive media presentation 430, and the event 432 corresponds exclusively to the media content 410. In such example embodiments, occurrence of the event 432 is always accompanied by a playback of the media content 410, and a playback of the media content 410 always signifies the occurrence of the event 432. For example, the media content 410 may be a video (e.g., a “cut scene”) that is played only between Level 3 and Level 4 (e.g., upon completion of Level 3) in a multi-level computer game. As another example, the media content 410 may be a sound (e.g., a trumpet fanfare) that is played only whenever a player is promoted to a higher rank in a military simulation game. In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be performed based simply on the second identifier (e.g., generated fingerprint) matching the first identifier (e.g., reference fingerprint).
In alternative example embodiments, however, the media content 410 is not unique within the interactive media presentation 430. The event 432 may correspond nonexclusively to the media content 410. In such example embodiments, a matching of the second identifier to the first identifier in operation 730 may be insufficient to detect the event 432. Accordingly, one or more additional comparisons of generated identifiers to reference identifiers may be used by the device 131 to detect the event 432.
In operation 810, the reference module 310 of the device 131 accesses a third identifier in a manner similar to that described above with respect to operation 710. The third identifier may be a further reference identifier and may be accessed from the database 115, and the third identifier may be obtained from the media content 412, which is presentable as part of the interactive media presentation 430. In some example embodiments, the reference module 310 accesses a third fingerprint generated from the media content 412 (e.g., by the fingerprint generation module 220 of the reference server 110). In certain example embodiments, the reference module 310 may access a third watermark extracted from the media content 412 (e.g., by the watermark extraction module 230 of the reference server 110).
In operation 820, according to some example embodiments, the fingerprint module 320 of the device 131 generates a fourth identifier (e.g., a fourth fingerprint) from a playback of the media content 412 as part of the interactive media presentation 430. The fourth identifier may be called a further generated identifier. In certain example embodiments, the watermark module 330 of the device 131 extracts the fourth identifier (e.g., a fourth watermark) from the playback of the media content 412 as part of the interactive media presentation 430.
In example embodiments that include operations 810 and 820, the detecting of the occurrence of the event 432 in operation 730 may be based on the third identifier (e.g., further reference identifier) matching the fourth identifier (e.g., further generated identifier). For example, the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third fingerprints (e.g., reference fingerprints) to generated second and fourth fingerprints (e.g., generated fingerprints). Based on the accessed first fingerprint matching the generated second fingerprint and the accessed third fingerprint matching the generated fourth fingerprint, the detection module 340 may detect the occurrence of the event 432. As another example, the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third watermarks (e.g., reference watermarks) to extracted second and fourth watermarks (e.g., extracted watermarks). Based on the accessed first watermark matching the extracted second watermark and the accessed third watermark matching the extracted fourth watermark, the detection module 340 may detect the occurrence of the event 432.
In some example embodiments, the media content 410 is not unique within the interactive media presentation 430, but the event 432 corresponds exclusively to the playback of the media content 410 being contemporaneous (e.g., within a five-second window) with the playback of the media content 412 (e.g., further media content). In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on the event 432 corresponding exclusively to the contemporaneous playback of the media content 410 with the media content 412. In other words, the event 432 may be detected based on the event 432 corresponding exclusively to the fact that the playback of the media content 410 is contemporaneous with the playback of the media content 412.
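A minimal sketch of this contemporaneity test, again reusing the hypothetical `matches` helper from the earlier sketch: the event is declared only when observed playbacks matching both references fall within the example five-second window. The timestamp representation and data shapes are assumptions.

```python
# Hypothetical sketch: detect the event only when playbacks matching both
# reference fingerprints occur close together in time. `matches` is the
# illustrative helper sketched earlier.
from typing import Iterable, Tuple

import numpy as np

def detect_contemporaneous(ref_a: np.ndarray, ref_b: np.ndarray,
                           observations: Iterable[Tuple[float, np.ndarray]],
                           window_s: float = 5.0) -> bool:
    """`observations` pairs a playback timestamp (seconds) with the
    fingerprint generated from that playback (echoing operation 820)."""
    obs = list(observations)
    times_a = [t for t, fp in obs if matches(ref_a, fp)]
    times_b = [t for t, fp in obs if matches(ref_b, fp)]
    return any(abs(ta - tb) <= window_s for ta in times_a for tb in times_b)
```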
In certain example embodiments, the media content 410 is not unique within the interactive media presentation 430, but the event 432 corresponds nonexclusively to the playback of the media content 410 being contemporaneous (e.g., within a two-second window) with the playback of the media content 412 (e.g., further media content). In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on a probability that the playback of the media content 410 is contemporaneous with the playback of the media content 412. In operation 826, the detection module 340 of the device 131 accesses such a probability (e.g., from the network-based system 105 or any portion thereof). For example, the detection module 340 may access a 90% probability that the event 432 is accompanied by a contemporaneous playback of the media content 410 with the media content 412. Accordingly, the detection module 340 may perform operation 730 based on the accessed probability.
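One simple reading of operation 826 is sketched below, under the assumption that the accessed probability can be compared against a decision threshold before the match is treated as the event 432. The 0.8 threshold is invented for illustration, and other readings (e.g., combining the probability with other evidence) are equally consistent with the description.

```python
# Hypothetical sketch of operation 826: gate the contemporaneous match on an
# accessed probability. The threshold is an assumption, not from the patent.
def detect_with_probability(both_matched: bool,
                            contemporaneous_probability: float,
                            threshold: float = 0.8) -> bool:
    """Treat the match as the event only if the probability clears a threshold."""
    return both_matched and contemporaneous_probability >= threshold

# With the 90% probability from the example above:
assert detect_with_probability(True, 0.90) is True
```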
Alternatively, in such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on a history of events that occurred within the interactive media presentation 430 prior to the playback of the media content 412 within the interactive media presentation 430. In operation 828, the detection module 340 of the device 131 accesses such a history of detected events (e.g., from the network-based system 105 or any portion thereof). For example, the detection module 340 may access a log of events that have been previously detected by the device 131 while monitoring the interactive media presentation 430, and the log may indicate that other events (e.g., aside from the event 432) signified by a playback of the media content 410, the media content 412, or both, have already been detected as having occurred. Thus, the events that have already occurred may be eliminated as potential candidates for detection in operation 730. Accordingly, the detection module 340 may perform operation 730 based on the accessed history of detected events.
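A brief sketch of operation 828's elimination logic, with invented identifiers: events already present in the accessed history are removed from the candidate set, and a detection is reported only when a single candidate remains.

```python
# Hypothetical sketch of operation 828: use the history of detected events to
# disambiguate among candidate events that share the same media content.
from typing import Iterable, Optional

def disambiguate(candidates: Iterable[str],
                 history_log: Iterable[str]) -> Optional[str]:
    """Drop candidates already detected earlier in this play-through; report a
    detection only if exactly one candidate remains."""
    seen = set(history_log)
    remaining = [event for event in candidates if event not in seen]
    return remaining[0] if len(remaining) == 1 else None

# Both levels share the same fanfare, but Level 3's event was already logged:
assert disambiguate(["event_level3", "event_level7"],
                    ["event_level3"]) == "event_level7"
```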
As shown in FIG. 9, the method 700 may include one or more of operations 840, 841, 842, 843, 844, 845, 846, and 850. One or more of operations 840-846 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 740, in which the presentation module 350 of the device 131 presents the notification 520.
In some example embodiments, the event 432 includes (e.g., indicates or signifies) an achievement within a game by a player of the game (e.g., an in-game achievement by the user 132). Examples of such an achievement include completing a level of the game, performing a particular set of tasks within the game, gaining access to a feature of the game (e.g., discovering or unlocking hidden content or “Easter eggs”), or any suitable combination thereof. In operation 840, the presentation module 350 presents information 630, which may reference the achievement by the player. For example, the information 630 may include a congratulatory message that mentions the achievement. As another example, the information 630 may include a suggestion that the user 132 (e.g., player) share the achievement with a socially connected friend (e.g., by sending a message that references the achievement).
In certain example embodiments, the event 432 includes (e.g., indicates or signifies) an acquisition of a virtual item within a virtual world. Examples of such an acquisition include obtaining a significant talisman or weapon (e.g., within a fantasy adventure game that is set within a three-dimensional virtual world), gaining access to an upgrade to an existing virtual item (e.g., a faster car in a racing game), receiving a large sum of virtual money (e.g., treasure or prizes), or any suitable combination thereof. In operation 841, the presentation module 350 presents information 630, which may reference the acquisition of the virtual item. For example, the information 630 may include a congratulatory message that mentions the acquisition. As another example, the information 630 may include a suggestion that the user 132 (e.g., player) share news of the acquisition with a socially connected friend (e.g., by sending a message that references the acquisition).
In some example embodiments, the event 432 includes (e.g., indicates or signifies) a level of progress within a storyline of a game. For example, the media content 410 may include video content, audio content, or both, that indicates the level of progress (e.g., special music that is specific to a particular section of the storyline). In operation 842, the presentation module 350 presents information 640, which may be based on, and may reference, the level of progress within the storyline. For example, the information 640 may include a summary of the storyline up to the current level of progress, a preview of the next level of progress in the storyline, advice, tips, suggestions, encouragements, or any suitable combination thereof.
In certain example embodiments, the event 432 includes (e.g., indicates or signifies) a level of advancement within the game by a player of the game (e.g., a new in-game rank attained by the user 132). For example, the media content 410 may include video content, audio content, or both, that indicates the level of advancement (e.g., special insignia presented on the screen or special sound effects). In operation 842, the presentation module 350 presents information 640, which may be based on, and may reference, the level of advancement within the game. For example, the information 640 may include a description of a rank to which the player has been promoted, a description of new abilities or powers accorded to the level of advancement, an indication of progress toward the next level of advancement, indication of effort (e.g., measured in time, actions, or both) expended in reaching the level of advancement, or any suitable combination thereof.
In some example embodiments, the media content 410 includes content (e.g., audio content or video content) that indicates a virtual location within a virtual world. As noted above, the notification 520 may include information (e.g., information 650) that corresponds to a virtual location, virtual orientation, or both, within a virtual world. Accordingly, in operation 843, the presentation module 350 of the device 131 presents information (e.g., information 650) that corresponds to (e.g., describes or references) the virtual location within the virtual world.
Moreover, as noted above, the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world. Hence, in operation 844, the presentation module 350 presents information (e.g., information 650) that corresponds to the virtual orientation (e.g., at the virtual location) within the virtual world.
According to various example embodiments, as noted above, the notification 520 may include information (e.g., information 620) that contains a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof. Accordingly, in operation 845, the presentation module 350 of the device 131 presents a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof, in performance of operation 740. In some example embodiments, the media content 410 indicates a failure to achieve a goal within the interactive media presentation 430 (e.g., a failure by a player of a videogame to achieve a goal within the game). In such example embodiments, operation 845 may involve presenting a suggestion on achieving the goal (e.g., in the next attempt or some future attempt), an encouragement to the player (e.g., to the user 132, that the user 132 try again to achieve the goal), an advertisement for a virtual item within the interactive media presentation 430 (e.g., a purchasable virtual item that may facilitate achieving the goal), or any suitable combination thereof.
In operation 846, the presentation module 350 of the device 131 communicates the notification 520 with the reference 660, which may describe the user 132. As noted above, the notification 520 may be communicated to the user 152 (e.g., via the device 150), who may be socially connected (e.g., as a friend, follower, or connection) to the user 132 via one or more social networking services. This may have the effect of notifying the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430.
Operation 850 may be performed after operation 730, in which the detection module 340 of the device 131 detects the occurrence of the event 432. As shown in FIG. 9, operation 850 may follow operation 740, in which the presentation module 350 of the device 131 presents the notification 520. In operation 850, the detection module 340 stores a reference to the occurrence of the event 432. The reference may be stored in a data record (e.g., within the database 115) that corresponds to the user 132 (e.g., a player of a videogame) from whom the user input 420 may be received. As noted above, the user input 420 may be a basis (e.g., an influence, a factor, or a control signal) for the interactive media presentation 430. Accordingly, the stored reference may be usable to identify a portion (e.g., section, chapter, level, or part) of the interactive media presentation 430 that contains the event 432. That is, the stored reference may indicate, designate, or define the portion within which the event 432 is configured to occur (e.g., in response to the user input 420) within the interactive media presentation 430. This may have the effect of annotating that the user 132 has been presented with this portion of the interactive media presentation 430. In example embodiments where the interactive media presentation 430 is a game, operation 850 results in the detection of the event 432 within the interactive media presentation 430 being recorded (e.g., in the database 115) in a game history of the user 132.
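A minimal sketch of operation 850, assuming a dictionary-backed, per-user data record standing in for the database 115; the field names and in-memory store are illustrative.

```python
# Hypothetical sketch of operation 850: append the detected occurrence to a
# per-user record so later detections can consult the game history.
import time
from typing import Dict

def store_occurrence(records: Dict[str, dict], user_id: str,
                     event_id: str, portion: str) -> None:
    """Record that `user_id` was presented with the portion containing the event."""
    history = records.setdefault(user_id, {"game_history": []})["game_history"]
    history.append({
        "event_id": event_id,
        "portion": portion,        # identifies where the event is configured to occur
        "detected_at": time.time(),
    })

records: Dict[str, dict] = {}
store_occurrence(records, "user_132", "event_432", "end-of-Level-3 cut scene")
```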
According to various example embodiments, one or more of the methodologies described herein may facilitate detection of an event within a presentation of interactive media. Moreover, one or more of the methodologies described herein may facilitate presentation of a notification that references an event within such interactive media. Hence, one or more of the methodologies described herein may facilitate provision of a supplementary information service that complements interactive media independently, with or without communication from a source of the interactive media.
When these effects are considered in aggregate, one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in detecting events within interactive media presentations. Efforts expended by a user in identifying a current portion of an interactive media presentation may be reduced by one or more of the methodologies described herein. Computing resources used by one or more machines, databases, or devices (e.g., within the network environment 100) may similarly be reduced. Examples of such computing resources include processor cycles, network traffic, memory usage, data storage capacity, power consumption, and cooling capacity.
FIG. 10 is a block diagram illustrating components of a machine 1000, according to some example embodiments, able to (e.g., configured to) read instructions from a machine-readable medium (e.g., a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part. Specifically, FIG. 10 shows a diagrammatic representation of the machine 1000 in the example form of a computer system and within which instructions 1024 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part. In alternative embodiments, the machine 1000 operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment. The machine 1000 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1024, sequentially or otherwise, that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 1024 to perform all or part of any one or more of the methodologies discussed herein.
The machine 1000 includes a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1004, and a static memory 1006, which are configured to communicate with each other via a bus 1008. The machine 1000 may further include a graphics display 1010 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)). The machine 1000 may also include an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument, such as a brain-control interface (BCI) or any other sensor capable of recording human biometrics as input for controlling a cursor), a storage unit 1016, a signal generation device 1018 (e.g., a speaker), and a network interface device 1020.
The storage unit 1016 includes a machine-readable medium 1022 on which is stored the instructions 1024 embodying any one or more of the methodologies or functions described herein. The instructions 1024 may also reside, completely or at least partially, within the main memory 1004, within the processor 1002 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 1000. Accordingly, the main memory 1004 and the processor 1002 may be considered as machine-readable media. The instructions 1024 may be transmitted or received over a network 1026 (e.g., network 190) via the network interface device 1020.
As used herein, the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by a machine (e.g., machine 1000), such that the instructions, when executed by one or more processors of the machine (e.g., processor 1002), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
Similarly, the methods described herein may be at least partially processor-implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of the subject matter discussed herein may be presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). Such algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms “a” or “an” are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.
The following enumerated descriptions define various example embodiments of methods, machine-readable media, and systems (e.g., apparatus) discussed herein:
1. A method comprising:
  • accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
  • generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
  • detecting an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint,
  • the detecting of the occurrence of the event being performed by a processor of a machine; and
  • presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
2. The method of description 1, wherein:
  • the media content is unique within the interactive media presentation; and
  • the event corresponds exclusively to the media content.
3. The method of description 1, wherein:
  • the media content is not unique within the interactive media presentation;
  • the event corresponds nonexclusively to the media content; and
  • the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
4. The method of description 3, wherein:
  • the event corresponds exclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
  • the detecting of the occurrence of the event is based on the event corresponding exclusively to the playback of the media content being contemporaneous with the playback of the further media content.
5. The method of description 3, wherein:
  • the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
  • the detecting of the occurrence of the event is based on a probability that the playback of the media content is contemporaneous with the playback of the further media content.
6. The method of description 3 or description 5, wherein:
  • the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
  • the detecting of the occurrence of the event is based on a history of events that occurred within the interactive media presentation prior to the playback of the further media content.
7. The method of any of descriptions 1-6, wherein:
  • the generating of the second fingerprint is in response to a request that a current portion of the interactive media presentation be identified;
  • the presenting of the notification identifies the current portion of the interactive media presentation; and the method further comprises
  • receiving the request that the current portion of the interactive media presentation be identified.
8. The method of any of descriptions 1-7, wherein:
  • the event includes an achievement within a game by a player of the game; and
  • the presenting of the notification presents information that references the achievement by the player.
9. The method of any of descriptions 1-8, wherein:
  • the event includes acquisition of a virtual item within a virtual world; and
  • the presenting of the notification presents information that describes the virtual item.
10. The method of any of descriptions 1-9, wherein:
  • the media content includes video content that indicates a level of progress within a storyline of a game; and
  • the presenting of the notification presents information based on the level of progress within the storyline of the game.
11. The method of any of descriptions 1-10, wherein:
  • the media content includes audio content that indicates a level of advancement within a game by a player of the game; and
  • the presenting of the notification presents information based on the level of advancement within the game by the player.
12. The method of any of descriptions 1-11, wherein:
  • the media content includes audio content that indicates a virtual location within a virtual world; and
  • the presenting of the notification presents information that corresponds to the virtual location within the virtual world.
13. The method of any of descriptions 1-12, wherein:
  • the media content includes multi-channel audio content that indicates a virtual orientation within a virtual world; and
  • the presenting of the notification presents information that corresponds to the virtual orientation within the virtual world.
14. The method of any of descriptions 1-13, wherein:
  • the media content indicates a failure to achieve a goal within a game by a player of the game; and
  • the presenting of the notification presents at least one of a suggestion on achieving the goal, an encouragement to the player, or an advertisement for a virtual item within the game.
15. The method of any of descriptions 1-14, wherein:
  • a first user that submitted the user input is socially connected to a second user according to a social network; and
  • the presenting of the notification includes communicating the notification with a reference to the first user to a device of the second user.
16. The method of any of descriptions 1-15 further comprising:
  • storing a reference to the occurrence of the event in a data record that corresponds to a user that submitted the user input,
  • the stored reference being usable to identify the part of the interactive media presentation within which the event is configured to occur.
17. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
  • generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
  • detecting an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint,
  • the detecting of the occurrence of the event being performed by the one or more processors of the machine; and
  • presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
18. The non-transitory machine-readable storage medium of description 17, wherein:
  • the media content is not unique within the interactive media presentation;
  • the event corresponds nonexclusively to the media content; and
  • the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
19. A system comprising:
  • a reference module configured to access a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
  • a fingerprint module configured to generate a second fingerprint from a playback of the media content as part of the interactive media presentation;
  • a processor configured by a detection module to detect an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint; and
  • a presentation module configured to present a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
20. The system of description 19, wherein:
  • the media content is not unique within the interactive media presentation;
  • the event corresponds nonexclusively to the media content; and
  • the detection module configures the processor to detect the occurrence of the event based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
21. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • accessing a first watermark embedded within media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
  • detecting a second watermark within a playback of the media content as part of the interactive media presentation;
  • detecting an occurrence of the event within the interactive media presentation based on the second watermark matching the first watermark,
  • the detecting of the occurrence of the event being performed by the one or more processors of the machine; and
  • presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second watermark matching the first watermark.

Claims (20)

What is claimed is:
1. A method comprising:
accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input, the media content including spatialized multi-channel audio content that indicates a virtual directional orientation towards which a player avatar is facing within a virtual world;
generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
detecting, using at least one processor of a machine, an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint; and
presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint and includes information that corresponds to the virtual directional orientation within the virtual world.
2. The method of claim 1, wherein:
the media content is unique within the interactive media presentation; and
the event corresponds exclusively to the media content.
3. The method of claim 1, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
4. The method of claim 3, wherein:
the event corresponds exclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on the event corresponding exclusively to the playback of the media content being contemporaneous with the playback of the further media content.
5. The method of claim 3, wherein:
the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on a probability that the playback of the media content is contemporaneous with the playback of the further media content.
6. The method of claim 3, wherein:
the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on a history of events that occurred within the interactive media presentation prior to the playback of the further media content.
7. The method of claim 1, wherein:
the generating of the second fingerprint is in response to a request that a current portion of the interactive media presentation be identified;
the presenting of the notification identifies the current portion of the interactive media presentation; and the method further comprises
receiving the request that the current portion of the interactive media presentation be identified.
8. The method of claim 1, wherein:
the event includes an achievement within a game by a player of the game; and
the presenting of the notification presents information that references the achievement by the player.
9. The method of claim 1, wherein:
the event includes acquisition of a virtual item within a virtual world; and
the presenting of the notification presents information that describes the virtual item.
10. The method of claim 1, wherein:
the media content includes video content that indicates a level of progress within a storyline of a game; and
the presenting of the notification presents information based on the level of progress within the storyline of the game.
11. The method of claim 1, wherein:
the media content includes audio content that indicates a level of advancement within a game by a player of the game; and
the presenting of the notification presents information based on the level of advancement within the game by the player.
12. The method of claim 1, wherein:
the media content includes audio content that indicates a virtual location within a virtual world; and
the presenting of the notification presents information that corresponds to the virtual location within the virtual world.
13. The method of claim 1, wherein:
the media content indicates a failure to achieve a goal within a game by a player of the game; and
the presenting of the notification presents at least one of a suggestion on achieving the goal, an encouragement to the player, or an advertisement for a virtual item within the game.
14. The method of claim 1, wherein:
a first user that submitted the user input is socially connected to a second user according to a social network; and
the presenting of the notification includes communicating the notification with a reference to the first user to a device of the second user.
15. The method of claim 1 further comprising:
storing a reference to the occurrence of the event in a data record that corresponds to a user that submitted the user input, the stored reference being usable to identify the part of the interactive media presentation within which the event is configured to occur.
16. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input, the media content including spatialized multi-channel audio content that indicates a virtual directional orientation towards which a player avatar is facing within a virtual world;
generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
detecting, using at least one of the processors of the machine, an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint; and
presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint and includes information that corresponds to the virtual directional orientation within the virtual world.
17. The non-transitory machine-readable storage medium of claim 16, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
18. A system comprising:
a reference module configured to access a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input, the media content including spatialized multi-channel audio content that indicates a virtual directional orientation towards which a player avatar is facing within a virtual world;
a fingerprint module configured to generate a second fingerprint from a playback of the media content as part of the interactive media presentation;
a processor configured by a detection module to detect an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint; and
a presentation module configured to present a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint and includes information that corresponds to the virtual directional orientation within the virtual world.
19. The system of claim 18, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detection module configures the processor to detect the occurrence of the event based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
20. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a first watermark embedded within media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input, the media content including spatialized multi-channel audio content that indicates a virtual directional orientation towards which a player avatar is facing within a virtual world;
detecting a second watermark within a playback of the media content as part of the interactive media presentation;
detecting an occurrence of the event within the interactive media presentation based on the second watermark matching the first watermark, the detecting of the occurrence of the event being performed by the one or more processors of the machine; and
presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second watermark matching the first watermark and includes information that corresponds to the virtual directional orientation within the virtual world.
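Claim 20 swaps fingerprint comparison for watermark matching: an identifier embedded in the spatialized audio (the first watermark) is recovered from playback (as the second watermark), and the event is detected when the two match. The sketch below is an editorial illustration only; the byte-marker decoder is a hypothetical stand-in for real audio watermarking schemes such as spread-spectrum or echo hiding.

```python
# Minimal sketch of the watermark variant in claim 20 (decoder is a toy stand-in).
def extract_watermark(audio_frame):
    # Hypothetical decoder: real extraction correlates the signal against a
    # known carrier; here the payload is simply the 8 bytes after a marker.
    marker = b"WM:"
    idx = audio_frame.find(marker)
    if idx == -1:
        return None
    start = idx + len(marker)
    return audio_frame[start:start + 8]


def detect_event_by_watermark(first_watermark, playback_frames):
    """Returns True when a second watermark recovered from the playback
    matches the first watermark accessed from the reference content."""
    for frame in playback_frames:
        if extract_watermark(frame) == first_watermark:
            return True  # occurrence detected; a notification would follow
    return False
```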
US13/795,877 2013-03-12 2013-03-12 Detecting an event within interactive media including spatialized multi-channel audio content Active 2034-08-09 US9372531B2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US13/795,877 US9372531B2 (en) 2013-03-12 2013-03-12 Detecting an event within interactive media including spatialized multi-channel audio content
US15/003,658 US10055010B2 (en) 2013-03-12 2016-01-21 Detecting an event within interactive media including spatialized multi-channel audio content
US16/017,170 US10156894B2 (en) 2013-03-12 2018-06-25 Detecting an event within interactive media
US16/168,412 US10345892B2 (en) 2013-03-12 2018-10-23 Detecting and responding to an event within an interactive videogame
US16/425,490 US10824222B2 (en) 2013-03-12 2019-05-29 Detecting and responding to an event within an interactive videogame
US16/947,969 US11068042B2 (en) 2013-03-12 2020-08-26 Detecting and responding to an event within an interactive videogame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/795,877 US9372531B2 (en) 2013-03-12 2013-03-12 Detecting an event within interactive media including spatialized multi-channel audio content

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/003,658 Continuation US10055010B2 (en) 2013-03-12 2016-01-21 Detecting an event within interactive media including spatialized multi-channel audio content

Publications (2)

Publication Number Publication Date
US20140274353A1 (en) 2014-09-18
US9372531B2 (en) 2016-06-21

Family

ID=51529540

Family Applications (6)

Application Number Title Priority Date Filing Date
US13/795,877 Active 2034-08-09 US9372531B2 (en) 2013-03-12 2013-03-12 Detecting an event within interactive media including spatialized multi-channel audio content
US15/003,658 Active 2034-04-07 US10055010B2 (en) 2013-03-12 2016-01-21 Detecting an event within interactive media including spatialized multi-channel audio content
US16/017,170 Active US10156894B2 (en) 2013-03-12 2018-06-25 Detecting an event within interactive media
US16/168,412 Active US10345892B2 (en) 2013-03-12 2018-10-23 Detecting and responding to an event within an interactive videogame
US16/425,490 Active US10824222B2 (en) 2013-03-12 2019-05-29 Detecting and responding to an event within an interactive videogame
US16/947,969 Active US11068042B2 (en) 2013-03-12 2020-08-26 Detecting and responding to an event within an interactive videogame

Family Applications After (5)

Application Number Title Priority Date Filing Date
US15/003,658 Active 2034-04-07 US10055010B2 (en) 2013-03-12 2016-01-21 Detecting an event within interactive media including spatialized multi-channel audio content
US16/017,170 Active US10156894B2 (en) 2013-03-12 2018-06-25 Detecting an event within interactive media
US16/168,412 Active US10345892B2 (en) 2013-03-12 2018-10-23 Detecting and responding to an event within an interactive videogame
US16/425,490 Active US10824222B2 (en) 2013-03-12 2019-05-29 Detecting and responding to an event within an interactive videogame
US16/947,969 Active US11068042B2 (en) 2013-03-12 2020-08-26 Detecting and responding to an event within an interactive videogame

Country Status (1)

Country Link
US (6) US9372531B2 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9372531B2 (en) 2013-03-12 2016-06-21 Gracenote, Inc. Detecting an event within interactive media including spatialized multi-channel audio content
US10965991B2 (en) 2013-09-06 2021-03-30 Gracenote, Inc. Displaying an actionable element over playing content
CN103616998B (en) * 2013-11-15 2018-04-06 北京智谷睿拓技术服务有限公司 User information acquiring method and user profile acquisition device
CN103678971B (en) * 2013-11-15 2019-05-07 北京智谷睿拓技术服务有限公司 User information extracting method and user information extraction element
CN103646656B (en) * 2013-11-29 2016-05-04 腾讯科技(成都)有限公司 Sound effect treatment method, device, plugin manager and audio plug-in unit
US20170050108A1 (en) * 2014-03-06 2017-02-23 MNET Mobile Pty Ltd. Method of Synchronising Human Activity That Includes Use of a Portable Computer Device With Audio Output From a Primary Device
US9393486B2 (en) * 2014-06-27 2016-07-19 Amazon Technologies, Inc. Character simulation and playback notification in game session replay
US20170128836A1 (en) * 2015-11-11 2017-05-11 Rovio Entertainment Ltd. Game content unlock method
CA3043863A1 (en) * 2016-03-21 2017-09-28 Liveramp, Inc. Data watermarking and fingerprinting system and method
US10764646B2 (en) 2016-08-25 2020-09-01 Through The Lens Entertainment Corporation System and method for managing interactive media
US10279260B2 (en) 2017-03-06 2019-05-07 Sony Interactive Entertainment LLC Cut-scene gameplay
US10547658B2 (en) * 2017-03-23 2020-01-28 Cognant Llc System and method for managing content presentation on client devices
US10709989B2 (en) * 2018-01-11 2020-07-14 Electronics And Telecommunications Research Institute System and method for analyzing game update effect according to change of gamer action sequence
US10661173B2 (en) * 2018-06-26 2020-05-26 Sony Interactive Entertainment Inc. Systems and methods to provide audible output based on section of content being presented
US11325044B2 (en) * 2019-03-07 2022-05-10 Sony Interactive Entertainment LLC Video game guidance system
US11006191B2 (en) * 2019-08-02 2021-05-11 The Nielsen Company (Us), Llc Use of watermarking to control abandonment of dynamic content modification
CN111787341B (en) * 2020-05-29 2023-12-05 北京京东尚科信息技术有限公司 Guide broadcasting method, device and system

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8108509B2 (en) * 2001-04-30 2012-01-31 Sony Computer Entertainment America Llc Altering network transmitted content data based upon user specified characteristics
US8122466B2 (en) * 2001-11-20 2012-02-21 Portulim Foundation Llc System and method for updating digital media content
US8504652B2 (en) * 2006-04-10 2013-08-06 Portulim Foundation Llc Method and system for selectively supplying media content to a user and media storage device for use therein
US8909729B2 (en) * 2001-11-20 2014-12-09 Portulim Foundation Llc System and method for sharing digital media content
US7711774B1 (en) * 2001-11-20 2010-05-04 Reagan Inventions Llc Interactive, multi-user media delivery system
US7503059B1 (en) * 2001-12-28 2009-03-10 Rothschild Trust Holdings, Llc Method of enhancing media content and a media enhancement system
US20090143141A1 (en) * 2002-08-06 2009-06-04 Igt Intelligent Multiplayer Gaming System With Multi-Touch Display
US20060009287A1 (en) * 2002-08-14 2006-01-12 Koninklijke Philips Electronics N.V. Remote control using collectible object
KR100542129B1 (en) * 2002-10-28 2006-01-11 한국전자통신연구원 Object-based three dimensional audio system and control method
KR100608613B1 (en) * 2003-06-04 2006-08-03 삼성전자주식회사 Method for providing audio rendition and storage media using thereby
US7519274B2 (en) * 2003-12-08 2009-04-14 Divx, Inc. File format for multiple track digital data
US8472792B2 (en) * 2003-12-08 2013-06-25 Divx, Llc Multimedia distribution system
EP1754393B1 (en) * 2004-04-16 2020-12-02 Dolby Laboratories Licensing Corporation System and method for use in creating an audio scene
US20050246638A1 (en) * 2004-04-30 2005-11-03 Microsoft Corporation Presenting in-game tips on a video game system
US20060135232A1 (en) * 2004-12-17 2006-06-22 Daniel Willis Method and system for delivering advertising content to video games based on game events and gamer activity
CN101227958A (en) * 2005-07-25 2008-07-23 皇家飞利浦电子股份有限公司 Method and system for identifying interactive children toy
US9101279B2 (en) * 2006-02-15 2015-08-11 Virtual Video Reality By Ritchey, Llc Mobile user borne brain activity data and surrounding environment data correlation system
US7919707B2 (en) * 2008-06-06 2011-04-05 Avid Technology, Inc. Musical sound identification
US20160019598A1 (en) * 2014-07-17 2016-01-21 David Harrison Targeted advertising and attribution across multiple screens based on playing games on a game console through a television
WO2010150249A1 (en) * 2009-06-25 2010-12-29 Tictacti Ltd. A system and method for ad placement in video game content
KR101319159B1 (en) * 2009-08-25 2013-10-17 주식회사 홍인터내셔날 Game machine and method for authentification of game data thereof
US20120086630A1 (en) * 2010-10-12 2012-04-12 Sony Computer Entertainment Inc. Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
US8854298B2 (en) * 2010-10-12 2014-10-07 Sony Computer Entertainment Inc. System for enabling a handheld device to capture video of an interactive application
US8805939B2 (en) 2010-11-03 2014-08-12 Microsoft Corporation Gaming notifications aggregator
US8769169B2 (en) * 2011-09-02 2014-07-01 Microsoft Corporation Assistive buffer usage techniques
US8409000B1 (en) * 2012-03-09 2013-04-02 Hulu Llc Configuring advertisements in a video segment based on a game result
US9372531B2 (en) 2013-03-12 2016-06-21 Gracenote, Inc. Detecting an event within interactive media including spatialized multi-channel audio content
US20150223005A1 (en) * 2014-01-31 2015-08-06 Raytheon Company 3-dimensional audio projection
US9253513B1 (en) * 2014-09-08 2016-02-02 Microsoft Technology Licensing, Llc Independent multi-panel display with cross-panel interactivity
US9497505B2 (en) 2014-09-30 2016-11-15 The Nielsen Company (Us), Llc Systems and methods to verify and/or correct media lineup information

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100027837A1 (en) * 1995-05-08 2010-02-04 Levy Kenneth L Extracting Multiple Identifiers from Audio and Video Content
US6053814A (en) * 1997-12-04 2000-04-25 Logitech, Inc. System and method for automatically adjusting game controller sensitivity to player inputs
US20020161586A1 (en) * 1998-10-15 2002-10-31 Jong-Ding Wang Voice control module for controlling a game controller
US20020161462A1 (en) * 2001-03-05 2002-10-31 Fay Todor J. Scripting solution for interactive audio generation
US20080004115A1 (en) * 2003-05-20 2008-01-03 Sony Computer Entertainment America Inc. Video game method and system with content-related options
US20060233389A1 (en) * 2003-08-27 2006-10-19 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US7682237B2 (en) * 2003-09-22 2010-03-23 Ssd Company Limited Music game with strike sounds changing in quality in the progress of music and entertainment music system
US7976385B2 (en) * 2004-05-11 2011-07-12 Mattel, Inc. Game controller with sensitivity adjustment
US20070155494A1 (en) * 2004-08-25 2007-07-05 Wells Robert V Video game system and method
US8616973B2 (en) * 2005-09-15 2013-12-31 Sony Computer Entertainment Inc. System and method for control by audible device
US20080009332A1 (en) * 2006-07-04 2008-01-10 Sony Computer Entertainment Inc. User interface apparatus and operational sensitivity adjusting method
US20090176569A1 (en) * 2006-07-07 2009-07-09 Ambx Uk Limited Ambient environment effects
US20130041648A1 (en) * 2008-10-27 2013-02-14 Sony Computer Entertainment Inc. Sound localization for user in motion
US20120014553A1 (en) * 2010-07-19 2012-01-19 Bonanno Carmine J Gaming headset with programmable audio paths
US20140004934A1 (en) * 2012-07-02 2014-01-02 Disney Enterprises, Inc. Tv-to-game sync
US8979658B1 (en) * 2013-10-10 2015-03-17 Voyetra Turtle Beach, Inc. Dynamic adjustment of game controller sensitivity based on audio analysis

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10567823B2 (en) 2008-11-26 2020-02-18 Free Stream Media Corp. Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device
US9716736B2 (en) 2008-11-26 2017-07-25 Free Stream Media Corp. System and method of discovery and launch associated with a networked media device
US10419541B2 (en) 2008-11-26 2019-09-17 Free Stream Media Corp. Remotely control devices over a network without authentication or registration
US10425675B2 (en) 2008-11-26 2019-09-24 Free Stream Media Corp. Discovery, access control, and communication with networked services
US9838758B2 (en) 2008-11-26 2017-12-05 David Harrison Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9848250B2 (en) 2008-11-26 2017-12-19 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9854330B2 (en) 2008-11-26 2017-12-26 David Harrison Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9866925B2 (en) 2008-11-26 2018-01-09 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US10986141B2 (en) 2008-11-26 2021-04-20 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9961388B2 (en) 2008-11-26 2018-05-01 David Harrison Exposure of public internet protocol addresses in an advertising exchange server to improve relevancy of advertisements
US9967295B2 (en) 2008-11-26 2018-05-08 David Harrison Automated discovery and launch of an application on a network enabled device
US9986279B2 (en) 2008-11-26 2018-05-29 Free Stream Media Corp. Discovery, access control, and communication with networked services
US10032191B2 (en) 2008-11-26 2018-07-24 Free Stream Media Corp. Advertisement targeting through embedded scripts in supply-side and demand-side platforms
US10074108B2 (en) 2008-11-26 2018-09-11 Free Stream Media Corp. Annotation of metadata through capture infrastructure
US10142377B2 (en) 2008-11-26 2018-11-27 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US10334324B2 (en) 2008-11-26 2019-06-25 Free Stream Media Corp. Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device
US9703947B2 (en) 2008-11-26 2017-07-11 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9686596B2 (en) 2008-11-26 2017-06-20 Free Stream Media Corp. Advertisement targeting through embedded scripts in supply-side and demand-side platforms
US9706265B2 (en) 2008-11-26 2017-07-11 Free Stream Media Corp. Automatic communications between networked devices such as televisions and mobile devices
US10631068B2 (en) 2008-11-26 2020-04-21 Free Stream Media Corp. Content exposure attribution based on renderings of related content across multiple devices
US10771525B2 (en) 2008-11-26 2020-09-08 Free Stream Media Corp. System and method of discovery and launch associated with a networked media device
US10791152B2 (en) 2008-11-26 2020-09-29 Free Stream Media Corp. Automatic communications between networked devices such as televisions and mobile devices
US10880340B2 (en) 2008-11-26 2020-12-29 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US10977693B2 (en) 2008-11-26 2021-04-13 Free Stream Media Corp. Association of content identifier of audio-visual data with additional data through capture infrastructure
US20180012610A1 (en) * 2013-06-19 2018-01-11 Dolby Laboratories Licensing Corporation Audio encoder and decoder with dynamic range compression metadata
US11404071B2 (en) 2013-06-19 2022-08-02 Dolby Laboratories Licensing Corporation Audio encoder and decoder with dynamic range compression metadata
US11823693B2 (en) 2013-06-19 2023-11-21 Dolby Laboratories Licensing Corporation Audio encoder and decoder with dynamic range compression metadata
US10956121B2 (en) 2013-09-12 2021-03-23 Dolby Laboratories Licensing Corporation Dynamic range control for a wide variety of playback environments
US11429341B2 (en) 2013-09-12 2022-08-30 Dolby International Ab Dynamic range control for a wide variety of playback environments
US11842122B2 (en) 2013-09-12 2023-12-12 Dolby Laboratories Licensing Corporation Dynamic range control for a wide variety of playback environments
US10972204B2 (en) 2017-06-12 2021-04-06 Gracenote, Inc. Detecting and responding to rendering of interactive video content
US10972203B2 (en) 2017-06-12 2021-04-06 Gracenote, Inc. Detecting and responding to rendering of interactive video content
US11134279B1 (en) 2017-07-27 2021-09-28 Amazon Technologies, Inc. Validation of media using fingerprinting
US11234060B2 (en) 2017-09-01 2022-01-25 Roku, Inc. Weave streaming content into a linear viewing experience
US11418858B2 (en) 2017-09-01 2022-08-16 Roku, Inc. Interactive content when the secondary content is server stitched
US11936467B2 (en) 2021-02-26 2024-03-19 Roku, Inc. Detecting and responding to rendering of interactive video content

Also Published As

Publication number Publication date
US20190278366A1 (en) 2019-09-12
US10156894B2 (en) 2018-12-18
US20200379549A1 (en) 2020-12-03
US10345892B2 (en) 2019-07-09
US10824222B2 (en) 2020-11-03
US20160139756A1 (en) 2016-05-19
US10055010B2 (en) 2018-08-21
US20180307300A1 (en) 2018-10-25
US11068042B2 (en) 2021-07-20
US20190056778A1 (en) 2019-02-21
US20140274353A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US11068042B2 (en) Detecting and responding to an event within an interactive videogame
JP6168544B2 (en) INTERACTION METHOD BASED ON MULTIMEDIA PROGRAM, TERMINAL DEVICE, AND SERVER
CN109756787B (en) Virtual gift generation method and device and virtual gift presentation system
RU2541924C2 (en) Information processing apparatus
US20160317933A1 (en) Automatic game support content generation and retrieval
JP6379107B2 (en) Information processing apparatus, control method therefor, and program
CN112511850A (en) Wheat connecting method, live broadcast display method, device, equipment and storage medium
US20230241502A1 (en) Server-Based Generation of a Help Map in a Video Game
CN114095742A (en) Video recommendation method and device, computer equipment and storage medium
KR102105525B1 (en) Method for providing game video, server for providing game video, and apparatus for executing the same
US11698927B2 (en) Contextual digital media processing systems and methods
US20110244946A1 (en) Personalized gaming experience
US20150375122A1 (en) Systems and methods for controlling multiple accounts
JP7131905B2 (en) Information processing method, server device, program, and information terminal
CN114073100B (en) Mapping view of digital content
KR20200038153A (en) Method for processing image of game on computing devices and computing devices
US20230221797A1 (en) Ephemeral Artificial Reality Experiences
CN112822558B (en) Information broadcasting method, device, equipment and medium based on online platform
US11854261B2 (en) Linking to social experiences in artificial reality environments
US20220141551A1 (en) Moving image distribution system, moving image distribution method, and moving image distribution program
US20210402300A1 (en) Real time interconnected game context and data sharing plugin framework
CN114146426A (en) Control method and device for game in secret room, computer equipment and storage medium
CN116785700A (en) Game data processing method, device, equipment and storage medium
CN114073100A (en) Mapping views of digital content
CN115484465A (en) Bullet screen generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: GRACENOTE, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BENSON, JEFF;GUBMAN, MICHAEL;KAWAHARA, CRAIG;AND OTHERS;SIGNING DATES FROM 20130307 TO 20130310;REEL/FRAME:029978/0387

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, ILLINOIS

Free format text: SECURITY INTEREST;ASSIGNOR:GRACENOTE, INC.;REEL/FRAME:032480/0272

Effective date: 20140314

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS

Free format text: SECURITY AGREEMENT;ASSIGNORS:GRACENOTE, INC.;CASTTV, INC.;TRIBUNE BROADCASTING COMPANY, LLC;REEL/FRAME:039667/0565

Effective date: 20160809

AS Assignment

Owner name: CASTTV INC., ILLINOIS

Free format text: RELEASE OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041656/0804

Effective date: 20170201

Owner name: TRIBUNE MEDIA SERVICES, LLC, ILLINOIS

Free format text: RELEASE OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041656/0804

Effective date: 20170201

Owner name: GRACENOTE, INC., CALIFORNIA

Free format text: RELEASE OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041656/0804

Effective date: 20170201

Owner name: TRIBUNE DIGITAL VENTURES, LLC, ILLINOIS

Free format text: RELEASE OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:041656/0804

Effective date: 20170201

AS Assignment

Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK

Free format text: SUPPLEMENTAL SECURITY AGREEMENT;ASSIGNORS:GRACENOTE, INC.;GRACENOTE MEDIA SERVICES, LLC;GRACENOTE DIGITAL VENTURES, LLC;REEL/FRAME:042262/0601

Effective date: 20170412

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: CITIBANK, N.A., NEW YORK

Free format text: SUPPLEMENTAL SECURITY AGREEMENT;ASSIGNORS:A. C. NIELSEN COMPANY, LLC;ACN HOLDINGS INC.;ACNIELSEN CORPORATION;AND OTHERS;REEL/FRAME:053473/0001

Effective date: 20200604

AS Assignment

Owner name: CITIBANK, N.A., NEW YORK

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE PATENTS LISTED ON SCHEDULE 1 RECORDED ON 6-9-2020 PREVIOUSLY RECORDED ON REEL 053473 FRAME 0001. ASSIGNOR(S) HEREBY CONFIRMS THE SUPPLEMENTAL IP SECURITY AGREEMENT;ASSIGNORS:A.C. NIELSEN (ARGENTINA) S.A.;A.C. NIELSEN COMPANY, LLC;ACN HOLDINGS INC.;AND OTHERS;REEL/FRAME:054066/0064

Effective date: 20200604

AS Assignment

Owner name: THE NIELSEN COMPANY (US), LLC, NEW YORK

Free format text: PARTIAL RELEASE OF SECURITY INTEREST;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:056973/0280

Effective date: 20210415

Owner name: GRACENOTE, INC., NEW YORK

Free format text: PARTIAL RELEASE OF SECURITY INTEREST;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:056973/0280

Effective date: 20210415

AS Assignment

Owner name: ROKU, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GRACENOTE, INC.;REEL/FRAME:056103/0786

Effective date: 20210415

AS Assignment

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND

Free format text: PATENT SECURITY AGREEMENT SUPPLEMENT;ASSIGNOR:ROKU, INC.;REEL/FRAME:056982/0194

Effective date: 20210622

AS Assignment

Owner name: GRACENOTE DIGITAL VENTURES, LLC, NEW YORK

Free format text: RELEASE (REEL 042262 / FRAME 0601);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:061748/0001

Effective date: 20221011

Owner name: GRACENOTE, INC., NEW YORK

Free format text: RELEASE (REEL 042262 / FRAME 0601);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:061748/0001

Effective date: 20221011

AS Assignment

Owner name: ROKU DX HOLDINGS, INC., CALIFORNIA

Free format text: TERMINATION AND RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT (REEL/FRAME 056982/0194);ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:062826/0664

Effective date: 20230221

Owner name: ROKU, INC., CALIFORNIA

Free format text: TERMINATION AND RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT (REEL/FRAME 056982/0194);ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:062826/0664

Effective date: 20230221

AS Assignment

Owner name: NETRATINGS, LLC, NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: THE NIELSEN COMPANY (US), LLC, NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: GRACENOTE MEDIA SERVICES, LLC, NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: GRACENOTE, INC., NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: EXELATE, INC., NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: A. C. NIELSEN COMPANY, LLC, NEW YORK

Free format text: RELEASE (REEL 053473 / FRAME 0001);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063603/0001

Effective date: 20221011

Owner name: NETRATINGS, LLC, NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

Owner name: THE NIELSEN COMPANY (US), LLC, NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

Owner name: GRACENOTE MEDIA SERVICES, LLC, NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

Owner name: GRACENOTE, INC., NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

Owner name: EXELATE, INC., NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

Owner name: A. C. NIELSEN COMPANY, LLC, NEW YORK

Free format text: RELEASE (REEL 054066 / FRAME 0064);ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:063605/0001

Effective date: 20221011

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8