WO2009033187A1 - System and method for detecting viewer attention to media delivery devices - Google Patents

System and method for detecting viewer attention to media delivery devices

Info

Publication number
WO2009033187A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
delivery device
media
media delivery
detector
Prior art date
Application number
PCT/US2008/075649
Other languages
French (fr)
Inventor
Hans C. Lee
Michael J. Lee
Tim Hong
William H. Williams
Original Assignee
Emsense Corporation
Priority date
Filing date
Publication date
Application filed by Emsense Corporation filed Critical Emsense Corporation
Priority to EP08829304A priority Critical patent/EP2208346A1/en
Publication of WO2009033187A1 publication Critical patent/WO2009033187A1/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/45Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying users
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/37Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying segments of broadcast information, e.g. scenes or extracting programme ID
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/38Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying broadcast time or space
    • H04H60/41Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying broadcast time or space for identifying broadcast space, i.e. broadcast channels, broadcast stations or broadcast areas
    • H04H60/43Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users for identifying broadcast time or space for identifying broadcast space, i.e. broadcast channels, broadcast stations or broadcast areas for identifying broadcast channels

Abstract

Embodiments of a system to accurately record whether viewers are actually watching, listening to, interacting with, or otherwise perceiving a television, computer monitor, or other media delivery device are described. A detector circuit is coupled to a media delivery device and configured to receive a signal transmitted from an emitter placed on the body of a user positioned proximate the media delivery device. An attention detector processor is coupled to the detector circuit and configured to determine whether the user is perceiving content provided by the media delivery device. The system records physiological data from viewers and transmits it to a central location. An integrated physiological sensing device measures viewers' cognitive and emotional responses to media and transmits them to a base station in close proximity to the sensing device. The base station records these responses to the media and combines them with context data from the electronics equipment.

Description

SYSTEM AND METHOD FOR DETECTING VIEWER ATTENTION TO MEDIA DELIVERY DEVICES
Inventors:
Hans C. Lee
Michael Lee
Tim Hong
William H. Williams
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation in part application of United States (US) Patent Application Number 11/681,265, filed March 2, 2007.
This application is a continuation in part application of US Patent Application Number 11/804,517, filed May 17, 2007.
This application claims the benefit of US Patent Application Number 60/970,898, filed September 7, 2007.
This application claims the benefit of US Patent Application Number 60/970,900, filed September 7, 2007.
This application claims the benefit of US Patent Application Number 60/970,905, filed September 7, 2007.
This application claims the benefit of US Patent Application Number 60/970,908, filed September 7, 2007.
This application claims the benefit of US Patent Application Number 60/970,913, filed September 7, 2007.
The present application claims the benefit of the U.S. Provisional Application No. 60/970,916 entitled "Methods and Systems for Media Viewer Attention Detection Using Means for Improving Information About Viewer's Preferences, Media Viewing Habits, and Other Factors," and filed on September 7, 2007.
The present application claims the benefit of the U.S. Provisional Application No. 60/970,920 entitled "Measuring Physiological Response to Media for Viewership Modeling By Integrating Into Home Electronics," and filed on September 7, 2007.
FIELD
Embodiments of the invention relate generally to media playback systems, and more specifically, to user awareness detection systems for televisions, computer monitors, and other media display devices.
BACKGROUND
Display devices, such as televisions, computer monitors, personal digital devices, and the like are the principal means of delivering electronic content. Content providers can deliver virtually any type of visual content through a myriad of display devices. The most common display means has traditionally been the television; however, the advent of the Internet and other networks has led to an increase in viewing through computers, game devices, and other media playback units. Although certain user activity can be tracked and measured with regard to content delivery, such as network sites visited or television shows tuned into, there is no present way of knowing whether a person is actually viewing, reading, or otherwise perceiving what is displayed when a television or computer monitor is turned on.
A significant disadvantage associated with current media research is its reliance on knowing the number of viewers who are watching a specific piece of media, for example a show or commercial on TV. The issue is that current technologies can only record when a television is on; they cannot account for the fact that, for much of the time a television or web page is visible, people are not looking at it but are instead out of the room or otherwise engaged.
Likewise, with computer systems, it may be possible to determine what content or network sites a user may access, but it is generally not possible to know whether or not the user is actually attending to or perceiving the information on the screen.
In addition, present systems have no way of indicating how a particular viewer feels about the content. In general, emotions are a key indicator of how well viewers like or dislike a particular media item, and whether they will likely want to watch it again in the future. Such information is not currently available, except through experiments involving a small number of people in a laboratory setting. One key issue with present systems is that they typically do not measure a sufficiently wide range of emotions. A second issue with present systems is that even if physiological data that reflects emotions can be recorded, it is generally not converted to a usable form. Such data is not generally obtained for very large numbers of people in their natural settings, such as sitting at home while watching television or working on a computer. Therefore, such data cannot be made available to agencies or entities that can utilize this emotional response data.
INCORPORATION BY REFERENCE
Each patent, patent application, and/or publication mentioned in this specification is herein incorporated by reference in its entirety to the same extent as if each individual patent, patent application, and/or publication was specifically and individually indicated to be incorporated by reference.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
Figure 1 illustrates an emitter-receiver based viewer attention detection system, under an embodiment.
Figure 2 illustrates an emitter-receiver based viewer attention detection system, under an alternative embodiment.
Figure 3 illustrates a camera-based viewer attention detection system, under an embodiment.
Figure 4 is a flowchart that illustrates a method of detecting and utilizing detected viewer attention to a media delivery device, under an embodiment.
Figure 5 illustrates a viewer physiological measurement system, under an embodiment.
Figure 6 illustrates examples of some of the characteristics for use by the sensors of Figure 5, under an embodiment.
Figure 7 illustrates an aggregate trace profile for use by the physiological sensing process, under an embodiment.
DETAILED DESCRIPTION
Embodiments of a system to accurately record whether viewers are actually watching, listening to, interacting with, or otherwise perceiving a media delivery device, such as a television, computer monitor, or other display mechanism, at any given moment are described. A system is configured to sense when a viewer is actually watching television or another electronic device, making it possible to know when he or she can be meaningfully engaged by the media. This knowledge can be used by market research entities to measure what media is being viewed and how actively it is being viewed. This can range from users actively paying attention to the screen, to passively watching it, to not viewing the screen at all. The system includes means to sense whether a viewer is oriented towards a TV, radio, monitor, or other media delivery device. Such a system can overcome the disadvantages associated with present systems, which generally have problems producing accurate models of viewership.
Detecting User Attention
In one embodiment, an emitter is attached to each viewer. The emitter sends out a signal only in the direction the viewer is looking. The system has a receiver for this signal placed in close proximity to the media device, such as a TV, monitor, or radio. If the signal is received, it is assumed that the viewer's head is oriented in the right direction to view the monitor. If the user leaves the room or looks the other way, the signal will diminish and disappear. Figure 1 illustrates an emitter-receiver based viewer detection system, under an embodiment. As shown in Figure 1, media delivery device (or "monitor") 102 comprises a display device configured to display any type of visual content, such as streaming video, still pictures, or any other visually perceivable image in analog or digital format. The media delivery device 102 may be embodied in a television, computer monitor, electronic tablet, or any other electronic display device. An audio playback unit, such as speaker 112, may be coupled to or incorporated in the media delivery device to provide audio output for analog or digital sound signals. A user 104 is positioned to perceive the video and/or audio signals from the media delivery device 102. Although the user may be positioned at an appropriate distance to receive the audio and visual signals, it is not always apparent whether or not the user is actually paying attention to the content. For the embodiment of Figure 1, the user has an emitter device 110 attached to part of the user's body, such as his or her head 104. The emitter is aligned with the optimum direction of perception through either or both of the eyes and ears of the user. The emitter transmits signals 101 corresponding to the user's line-of-sight 103. A detector circuit 106 included within, or coupled to, the monitor 102 is positioned to receive the emitted signals 101. When the user's face 104 is directed to the monitor 102, as indicated by the line-of-sight 103, the detector will receive the emitted signals 101 at or near full strength. Depending upon implementation, a range of signal strengths may be defined in which a received signal indicates that the user is looking at the monitor. The detected signals received by detector 106 are processed in an attention detector processor 108. In one embodiment, the emitter 110 may be implemented as a headset, headband, eyeglass lens system, or any similar system that is aligned to the user's eyes and sights along the user's line of sight when the user is looking straight ahead.
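The following is a minimal sketch of the signal-strength test described above. The 0-to-1 scale, the threshold value, and the function names are illustrative assumptions rather than values taken from the disclosure; a real detector circuit 106 would supply the strength reading.

```python
# Minimal sketch: classify attention from received emitter signal strength.
# The threshold is an assumed value, not one specified in the disclosure.

ATTENTION_THRESHOLD = 0.6   # assumed: at or above this fraction of full strength = head toward monitor

def user_is_watching(received_strength: float, full_strength: float = 1.0) -> bool:
    """Return True when the received emitter signal falls within the defined attention range."""
    return received_strength >= ATTENTION_THRESHOLD * full_strength

# Example: near full-strength reception while facing the screen, weak reception when turned away.
print(user_is_watching(0.95))   # True  -> line of sight on the monitor
print(user_is_watching(0.10))   # False -> user looking away or out of the room
```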
In an alternative embodiment, the emitter may be placed on the media device, with a receiver placed on the user that measures whether the signal is visible to the viewer. The user-based receiver can then transmit this information back to a base station through either wired or wireless means. Figure 2 illustrates an emitter-receiver based viewer detection system, under this alternative embodiment. As shown in Figure 2, monitor 202 and any associated audio playback component 212 are coupled to an emitter component 206. A user 204 is positioned to perceive the video and/or audio signals from the media delivery device 202. For the embodiment of Figure 2, the user has a detector device 210 attached to part of the user's body, such as his or her head 204. The emitter 206 is aligned with the optimum direction of perception through either or both of the eyes and ears of the user. The emitter transmits signals 201 in a direction corresponding to an optimum line-of-sight for viewing of the monitor. If the user 204 is in this optimum line-of-sight 203 position, the detector 210 attached to the user will receive the emitted signals 201 at or near full strength. Depending upon implementation, a range of signal strengths may be defined in which a received signal indicates that the user is looking at the monitor. The detected signals received by detector 210 are transmitted back to an attention detector processor 208. In one embodiment, the detector 210 may be implemented as a headset, headband, eyeglass lens system, or any similar system that is aligned to the user's eyes and sights along the user's line of sight when the user is looking straight ahead.
For the embodiments of Figures 1 and 2, the emitter can be an infrared emitter/detector. In an alternative embodiment, the emitter is an ultrasound emitter/detector. In a further alternative embodiment, the emitter and detector utilize laser technology. In yet a further alternative embodiment, a flickering light at a predetermined frequency is utilized. Other comparable emitters and sensors, known to those of ordinary skill in the art, can also be used. In addition, combinations of any of these methods can also work.
The embodiments of Figures 1 and 2 require an emitter/detector system that is distributed between the user and the media delivery device. In an alternative embodiment, detection of the user's orientation with respect to the media delivery device is accomplished by imaging the user's orientation in front of the monitor. For this embodiment, a camera is placed in close proximity to the media device, and a processing unit detects whether a user is properly positioned in front of the monitor, to indicate whether the user is perceiving the content provided by the monitor. Figure 3 illustrates a camera-based viewer attention detection system, under an embodiment. A camera 320 incorporated in, or coupled to, the monitor 302 is oriented to image a field of view 301 in front of the monitor. The camera may be a still picture camera, video camera, or any similar image capture device, and may be analog or digital-based. The camera 320 can be a single camera, a stereo pair, or a system of cameras.
The field of view 301 imaged by the camera 320 corresponds to an optimum line-of-sight 303 when a user 304 is viewing the monitor 302 from a head-on or nearly head-on orientation. The camera 320 is configured to detect whether there is a person in front of the monitor, and more specifically whether the user's face is pointed towards the monitor. The camera images within a specific field of focus and transmits images to an image processor component 310. The image processor component includes functions, such as face recognition software, that determine whether the user is looking at the monitor screen. In certain implementations, the direction of the user's eyes can be determined to make sure that the user is focusing on the screen, rather than just having his or her face in the direction of the screen. In one embodiment, the image data from the image processor 310 is passed on to an attention detector processor 308 for further processing.
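One possible realization of the camera-based detector is sketched below using OpenCV's stock frontal-face cascade; the library choice and detection parameters are assumptions for illustration and are not the face recognition software specified by the disclosure.

```python
# Sketch: treat a frontal-face detection in the camera frame as the user's face
# being pointed toward the monitor. Thresholds are assumed values.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_toward_monitor(frame) -> bool:
    """Return True if a frontal (monitor-facing) face is visible in the camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # A frontal-face hit is read as attention; a profile or absent face yields no detection.
    return len(faces) > 0

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # camera placed near the media delivery device
    ok, frame = cap.read()
    if ok:
        print("watching" if face_toward_monitor(frame) else "not watching")
    cap.release()
```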
It should be noted that any of the connections between the components in any of Figures 1-3 may be implemented through wired or wireless communication means. Likewise, in certain implementations, a computer-based network may be used to transmit one or more signals or data among the components.
In one embodiment, the user may be outfitted with an accelerometer that is attached to a portion of his or her body, such as the head, face, neck, torso, etc. The orientation of the accelerometer can be detected by the attention detector processor 308 to determine if the user is facing the monitor 302 screen. For this embodiment, the accelerometer circuit is attached to a portion of a user positioned proximate the media delivery device at a distance suitable to perceive the monitor. The accelerometer is configured to provide an indication of the position of the user's head relative to the media delivery device. A detector circuit can be coupled to the monitor to receive a signal transmitted from the accelerometer. An attention detector processor coupled to the detector circuit can be configured to determine whether the user is perceiving content provided by the monitor based on one or more signals from the accelerometer.
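A hedged sketch of the accelerometer variant follows: a head-mounted accelerometer's gravity vector gives head pitch, and a tolerance around the "facing the screen" pose is treated as attention. The axis convention, the pitch formula, and the tolerance are assumptions made only for illustration.

```python
# Sketch: infer whether the head is oriented toward the monitor from a
# 3-axis accelerometer reading. Axis convention and threshold are assumed.
import math

def head_pitch_deg(ax: float, ay: float, az: float) -> float:
    """Estimate head pitch (degrees) from a 3-axis accelerometer gravity reading."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

PITCH_TOLERANCE_DEG = 20.0   # assumed: a head tilted more than this is not facing the monitor

def facing_monitor(ax: float, ay: float, az: float) -> bool:
    return abs(head_pitch_deg(ax, ay, az)) <= PITCH_TOLERANCE_DEG

# Example: level head vs. head tilted sharply downward.
print(facing_monitor(0.05, 0.0, 1.0))   # True
print(facing_monitor(0.9, 0.0, 0.4))    # False
```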
In general, the viewer attention detection system according to embodiments can detect whether a viewer is oriented directly towards the media delivery device. This provides a reasonable indication that the user is paying attention to the media being delivered, and can also help to indicate instances when the user is not paying attention to the media. This information can be utilized by content providers for various purposes. For example, the percentage of time that a user is actively watching the media delivery device relative to the total time the device is powered on can define an "engagement" metric. Very good or engaging media will typically make people want to watch it, and they will be glued to their media delivery devices, while less engaging media, even if it is being transmitted to the viewer, may not be actively watched. This is a key new metric for media analysis.
Figure 4 is a flowchart that illustrates a method of detecting and utilizing detected viewer attention to a media delivery device, under an embodiment. In block 402, the system detects the direction of the attention of the user with respect to the media delivery device. This detection can be performed by the emitter/detector, camera-based, or accelerometer-based systems described above. The time period during which the user's attention is directed toward the media delivery device is then measured, block 404. An engagement metric that represents the attention time relative to the total power-on time of the device is then generated for the measured time period, block 406.
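A minimal sketch of the engagement metric from the flowchart of Figure 4 is shown below: attention time relative to total power-on time. The interval representation and function name are assumptions made for the example.

```python
# Sketch: engagement = attended time / total power-on time.

def engagement_metric(attention_intervals, power_on_seconds: float) -> float:
    """Fraction of the powered-on period during which attention was directed at the device."""
    attended = sum(end - start for start, end in attention_intervals)
    return attended / power_on_seconds if power_on_seconds > 0 else 0.0

# Example: 36 minutes of detected attention over a 60-minute session -> 0.6
print(engagement_metric([(0, 1200), (1500, 2460)], 3600))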
Another advantage of the attention detection system is the ability to aggregate this viewer "engagement" and watching time over very large numbers of participants to create models of viewership for given media types. This information can then be used as a baseline to identify how engaging each type of media is relative to other competitive sources. For example, knowing that a piece of media engages viewers for 60% of the time, with them actively watching or listening to the media, is an important measure. However, the key information is, given its media type, the relative engagement compared to its competition, where the competition average provides a benchmark. If the media is, for example, a TV program for a round of golf, and the average time viewers spend watching golf is usually 30%, then a 60% engagement measure in this case would be good. On the other hand, if the content was a thriller and the average time watching thrillers is 90% or more, then a 60% measure would indicate that the show was not particularly engaging.
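The benchmark comparison described above can be sketched as follows; the genre averages are the ones quoted in the example, and the dictionary structure is an assumption for illustration.

```python
# Sketch: the same 60% engagement reads very differently against genre averages.
GENRE_BENCHMARKS = {"golf": 0.30, "thriller": 0.90}   # averages taken from the example above

def relative_engagement(measured: float, genre: str) -> float:
    """Engagement relative to the competitive average for the media type."""
    return measured / GENRE_BENCHMARKS[genre]

print(relative_engagement(0.60, "golf"))      # 2.0  -> well above the genre average
print(relative_engagement(0.60, "thriller"))  # ~0.67 -> below the genre average
```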
This information can then be used to rate show viewership very accurately and provide a measure of the overall engagement by viewers. In one embodiment, the attention detection processing system can be deployed in viewers' homes as part of the usual delivery devices, such as the television. This would allow a great number of users' responses to be simultaneously measured and aggregated. Such a system can be used by television rating services to provide a more accurate measure of actual user interest, rather than just television tuning measurements.
Measuring Viewer Emotional Response
Embodiments of a system to accurately record a viewer's emotion as he or she is watching or listening to media content are described. The system records physiological data from viewers and transmits it back to a central location as viewers watch, listen to, or otherwise interact with media such as TV, radio, video games, web sites, or other media. An integrated physiological sensing device measures viewers' cognitive and emotional responses to media and transmits them to a base station in close proximity to the sensing device. The physiological sensing base station can be integrated into home electronics devices such as digital video recorders, TV cable boxes, video cassette recorders, DVD players, and gaming systems to record the viewer's emotional and cognitive responses to the media and combine these responses with context data from the electronics equipment, so that it is known what the viewer is responding to. Such a system allows data to be recorded for a very large number of people simultaneously as they sit in front of their televisions or computers.
Viewers react emotionally and rationally to media content. Initial reactions are emotional; viewers then rationalize their reactions to media messages and act on them emotionally. Understanding these reactions and how they impact thoughts and feelings about the message can be critical to creating effective and useful media content. In one embodiment, a physiological response measurement system incorporating a scalable physiological and brainwave measurement technology provides accurate, objective, and moment-by-moment analysis of how a viewer responds emotionally and cognitively to media messaging.
In one embodiment, one or more physiological sensors are attached to a viewer. The sensors measure certain physiological characteristics of the viewer that are relevant to emotional state while the viewer is perceiving media content. The signals recorded for the viewer can include heart rate, EEG (electroencephalography) measurements, EKG (electrocardiogram) measurements, BVP (blood volume pulse), motion, position, temperature, galvanic skin response, or other physiological indicators, and can be measured through sensors placed on or near the viewer. The sensor may include a set of electrodes or any appropriate fitting that attaches to one or more portions of the viewer's body. In one embodiment, the sensors are provided in a sensor assembly or headset unit that attaches to the viewer's head to obtain brainwave measurements.
Figure 5 illustrates a viewer physiological measurement system, under an embodiment. As shown in Figure 5, a media delivery device (or "monitor") 502 comprises a display device configured to display any type of visual content, such as streaming video, still pictures, or any other visually perceivable image in analog or digital format. The media delivery device 502 may be embodied in a television, computer monitor, electronic tablet, or any other electronic display device. An audio playback unit, such as speaker 512 may be coupled to or incorporated in the media delivery device to provide audio output for analog or digital sound signals. A viewer 504 is positioned to perceive the video and/or audio signals from the media delivery device 502.
For the embodiment of Figure 5, the viewer has one or more physiological sensors 510 attached to or in proximity to appropriate parts of the user's body, such as his or her head 504. The sensors are configured to measure relevant physiological characteristics of the viewer. Various different physiological characteristics may be measured. Figure 6 illustrates examples of some of the characteristics for use by the sensors of Figure 5. As shown in Figure 6, the sensor or sensors can be configured to measure brainwave activity 602, breath 604, heartbeat 606, eye movement 608, body motion 610, temperature 612, and any other relevant measurement.
As a user views, listens to, or otherwise perceives 501 content provided by media delivery device 502, he or she registers appropriate physiological responses that are picked up by the sensors 510. This information is transmitted back to a central process so that viewers' physiological responses can be aggregated to create models of emotional and cognitive engagement in response to media.
In one embodiment, the central process comprises a physiological sensing process that is executed in a device 506 coupled to the media delivery device 502. For this embodiment, device 506 includes a processor and receiver to receive data from the external physiological sensing device 510. Transmission over line 503 can be done with a wired or wireless receiver that interfaces with the physiological sensing rig or sensor assembly.
The physiological sensing process 508 interprets the information received from the physiological sensors and then creates a packet of information that can be electronically sent out over the internet, over telephone lines, or through other means to one or more central locations for use by other entities or agencies.
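An illustrative sketch of such a packet follows. The field names, JSON encoding, and channel labels are assumptions for the example, not structures specified by the disclosure; the resulting bytes could then be transmitted over the internet, a telephone line, or another connection as described above.

```python
# Sketch: bundle physiological samples with media context for central aggregation.
import json
import time

def build_response_packet(viewer_id: str, channel: str, media_timecode_s: float,
                          samples: dict) -> bytes:
    """Assemble a transmittable packet of physiological data plus media context."""
    packet = {
        "viewer_id": viewer_id,                                   # hypothetical identifier
        "timestamp": time.time(),
        "media": {"channel": channel, "timecode_s": media_timecode_s},
        "physiology": samples,                                    # e.g. {"heart_rate_bpm": 72, "gsr_uS": 1.4}
    }
    return json.dumps(packet).encode("utf-8")

packet = build_response_packet("viewer-001", "channel-7", 312.5,
                               {"heart_rate_bpm": 72, "gsr_uS": 1.4})
print(len(packet), "bytes ready for transmission")
```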
In one embodiment, the physiological sensing process 508 can be integrated within a media playback unit or source device that is closely coupled to or even incorporated within the media delivery device 502. Such a device can be a home electronics device such as a digital video recorder, cable box, video cassette recorder, DVD player, gaming system, or any similar device. In an alternative embodiment, the physiological sensing process may reside in a device that is separate from the playback device. Such a device records the media that the viewer is watching as well as the physiological data, and then sends both back to a central processing location. The media can then be analyzed to define what exactly the viewer was watching based on the sounds and visual content the viewer saw.
In one embodiment, the physiological information can be transmitted back over the cable over which the television signal was sent, to a central location. This can also be done using a telephone line, DSL (digital subscriber line) or wireless connection. The physiological data for each viewer is put into an electronic "packet" along with data about the media and sent to a new location that aggregates the information.
It should be noted that any of the connections between the components in Figure 5 may be implemented through wired or wireless communication means. Likewise, in certain implementations, a computer-based network may be used to transmit one or more signals or data among the components.
A content analyzer component 514 may also be provided to analyze the audio/video content to extract meaningful information such as the media title (e.g., song name or movie name), where the content on a digital video recorder is, which radio station or television station is playing, and so on.
In this case, the physiological signals are tagged with a marker that defines which media segment the response corresponds to, such as a TV show, time-code, channel number, commercial, movie, video game segment/position, radio station, song, seconds into a show, recorded versus live on TV, etc. This data can then be interpreted to define emotional and cognitive responses second-by-second in correspondence with the media content.
Besides physiological data, certain user profile information can also be used. User profile information can include objective information about the viewer, such as age, gender, income, and other indicators. This set of information can be sent back to a central location either over the same TV cable, or over a telephone line, internet connection, or other communication method. It can also be stored for later retrieval. The disclosed system for integrating recorders for physiological sensors into home electronic devices is a key advance that enables physiological data to be recorded in the home in a way that would not otherwise be possible. This replaces viewership modeling based solely on how many TVs or radios are on with the actual emotional responses of viewers to each piece of media.
The ability to track and measure involuntary and unfiltered brainwave responses to media content can provide major tools to marketers who understand that consumers do not make decisions in a purely rational, linear fashion, and that emotion has a predominant impact on these responses. The sensor system provides noninvasive tracking of certain body responses. The sensing process 508 provides a non-biased assessment of media and message content. In one embodiment, the sensing process 508 provides an aggregate trace profile that gives a picture of the emotional response of the viewer. This profile can be a combination of any or all of the responses provided by the sensors and illustrated in Figure 6. Figure 7 illustrates an aggregate trace profile for use by the physiological sensing process, under an embodiment.
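A hedged sketch of one way such an aggregate trace profile could be formed is shown below: per-sensor traces are normalized and averaged into a single moment-by-moment trace. The normalization scheme and equal weighting are assumptions for illustration, not the combination method specified by the disclosure.

```python
# Sketch: combine several Figure 6 channels into one aggregate trace profile.

def normalize(trace):
    """Scale a sensor trace to the 0-1 range so channels with different units can be combined."""
    lo, hi = min(trace), max(trace)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in trace]

def aggregate_trace(channels: dict):
    """Average normalized channels sample-by-sample into a single profile."""
    normed = [normalize(trace) for trace in channels.values()]
    n_samples = len(next(iter(normed)))
    return [sum(trace[i] for trace in normed) / len(normed) for i in range(n_samples)]

# Example with two equal-length channels (values are illustrative only).
profile = aggregate_trace({
    "heart_rate_bpm": [70, 74, 82, 78],
    "gsr_uS":         [1.2, 1.3, 1.9, 1.6],
})
print(profile)
```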
This information can then be used by content providers to model or predict viewership response to certain media content, and modify such content accordingly. Such content can comprise television shows, movies, songs, advertisements, video games, still pictures, spoken audio content, displayed text content (e.g., e-books and the like) or any other similar electronically distributed media content.
Embodiments are directed to a system comprising: a media delivery device; a detector circuit coupled to the media delivery device and configured to receive a signal transmitted from an emitter placed on the body of a user positioned proximate the media delivery device; and an attention detector processor coupled to the detector circuit and configured to determine whether the user is perceiving content provided by the media delivery device. In this system, the emitter and detector components utilize a transmission medium selected from the group consisting of: infrared transmission, ultrasound transmission, laser technology, and flickering light at a predetermined frequency. The media delivery device may be one of a television or a computer monitor. In this system, the emitter is placed in a head gear positioned on the head of the user and positioned to transmit the signal in a direction corresponding to the line-of-sight of the user, and wherein when the user positions him or herself in a manner that indicates that the user is watching or otherwise paying attention to the media delivery device, the detector receives the signal from the emitter.
Embodiments are further directed to a system comprising: a media delivery device; an emitter circuit coupled to the media delivery device and configured to transmit a signal to be received by a detector placed on the body of a user positioned proximate the media delivery device, wherein the detector is configured to transmit an indicator in the event the detector receives the signal; and an attention detector processor coupled to the emitter circuit and configured to receive the indicator from the detector when the signal from the emitter is received by the detector, in order to determine whether the user is perceiving content provided by the media delivery device.
In this system, the emitter and detector components utilize a transmission medium selected from the group consisting of: infrared transmission, ultrasound transmission, laser technology, and flickering light at a predetermined frequency. The media delivery device may be one of a television or a computer monitor. The emitter may be placed in a head gear positioned on the head of the user and positioned to transmit the signal in a direction corresponding to the line-of-sight of the user, and wherein when the user positions him or herself in a manner that indicates that the user is watching or otherwise paying attention to the media delivery device, the detector receives the signal from the emitter.
Embodiments are further directed to a system comprising: a media delivery device; a camera coupled to the media delivery device and configured to image an area corresponding to a viewing area in front of the media delivery device; an image processor coupled to the camera and configured to recognize the presence of a user's face within the viewing area; and an attention detector processor coupled to the image processor and configured to determine whether the user is perceiving content provided by the media delivery device. The camera can be one of a still image camera or a video camera, and the media delivery device is one of a television or a computer monitor. Embodiments include a system comprising: a media delivery device; an accelerometer circuit attached to a portion of a user positioned proximate the media delivery device at a distance suitable to perceive the media delivery device, the accelerometer configured to provide an indication of the position of the user's head relative to the media delivery device; a detector circuit coupled to the media delivery device and configured to receive a signal transmitted from the accelerometer; and an attention detector processor coupled to the detector circuit and configured to determine whether the user is perceiving content provided by the media delivery device based on one or more signals from the accelerometer.
In this system, the portion of the user is selected from the group consisting of the user's head, face, neck, and torso. The media delivery device can be one of a television or a computer monitor.
Embodiments are also directed to a system comprising: a media delivery device configured to provide media content to a viewer; a sensor assembly attached to one or more parts of a user's body, and configured to measure one or more physiological characteristics of the user while the viewer perceives the media content provided by the media delivery device; a receiver circuit configured to receive data relating to the one or more physiological characteristics of the user; and a physiological sensing process configured to interpret the physiological data to assess an emotional response of the viewer to the media content.
This system further comprises a content analyzer component configured to extract certain information from the media content, and to correlate specific portions of the media content with specific emotional responses.
The system further comprises a viewer profile component providing objective viewer profile information to supplement the emotional response.
In this system, the physiological characteristics are selected from the group consisting of: brainwave activity, breath, heartbeat, eye movement, body motion, and temperature.
In this system, the media delivery device is one of a television or a computer monitor. The sensor assembly can comprise a head gear positioned on the head of the user and positioned to transmit the signal in the direction of the media delivery device. The physiological sensing process can be integrated within a media playback device coupled to the media delivery device, and the media playback device is selected from the group consisting of: a digital video recorder, a TV cable box, a video cassette recorder, DVD player, and a gaming system.
Aspects of the embodiments described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices ("PLDs"), such as field programmable gate arrays ("FPGAs"), programmable array logic ("PAL") devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects of the method include: microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc. Furthermore, aspects of the described method may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor ("MOSFET") technologies like complementary metal-oxide semiconductor ("CMOS"), bipolar technologies like emitter-coupled logic ("ECL"), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.
It should also be noted that the various functions disclosed herein may be described using any number of combinations of hardware, firmware, and/or as data and/or instructions embodied in various machine-readable or computer-readable media, in terms of their behavioral, register transfer, logic component, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, and so on).
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of "including, but not limited to." Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words "herein," "hereunder," "above," "below," and words of similar import refer to this application as a whole and not to any particular portions of this application. When the word "or" is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list.
The above description of illustrated embodiments is not intended to be exhaustive or to limit the embodiments to the precise form or instructions disclosed. While specific embodiments of, and examples for, the disclosed system are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the described embodiments, as those skilled in the relevant art will recognize.
The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the described system in light of the above detailed description.
In general, in any following claims, the terms used should not be construed to limit the described system to the specific embodiments disclosed in the specification and the claims, but should be construed to include all operations or processes that operate under the claims. Accordingly, the described system is not limited by the disclosure, but instead the scope of the recited method is to be determined entirely by the claims.
While certain aspects of the system may be presented in certain claim forms, the inventor contemplates the various aspects of the methodology in any number of claim forms. For example, while only one aspect of the system is recited as embodied in a machine-readable medium, other aspects may likewise be embodied in a machine-readable medium. Accordingly, the inventor reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the described systems and methods.

Claims

CLAIMS
What is claimed is:
1. A system comprising: a media delivery device; a detector circuit coupled to the media delivery device and configured to receive a signal transmitted from an emitter placed on the body of a user positioned proximate the media delivery device; and an attention detector processor coupled to the detector circuit and configured to determine whether the user is perceiving content provided by the media delivery device.
2. The system of claim 1 wherein the emitter and detector components utilize a transmission medium selected from the group consisting of: infrared transmission, ultrasound transmission, laser technology, and flickering light at a predetermined frequency.
3. The system of claim 2 wherein the media delivery device is one of a television or a computer monitor.
4. The system of claim 1 wherein the emitter is placed in a head gear positioned on the head of the user and positioned to transmit the signal in a direction corresponding to the line-of-sight of the user, and wherein when the user positions him or herself in a manner that indicates that the user is watching or otherwise paying attention to the media delivery device, the detector receives the signal from the emitter.
5. A system comprising: a media delivery device; an emitter circuit coupled to the media delivery device and configured to transmit a signal to be received by a detector placed on the body of a user positioned proximate the media delivery device, wherein the detector is configured to transmit an indicator in the event the detector receives the signal; and an attention detector processor coupled to the emitter circuit and configured to receive the indicator from the detector when the signal from the emitter is received by the detector, in order to determine whether the user is perceiving content provided by the media delivery device.
6. The system of claim 5 wherein the emitter and detector components utilize a transmission medium selected from the group consisting of: infrared transmission, ultrasound transmission, laser technology, and flickering light at a predetermined frequency.
7. The system of claim 6 wherein the media delivery device is one of a television or a computer monitor.
8. The system of claim 5 wherein the emitter is placed in a head gear positioned on the head of the user and positioned to transmit the signal in a direction corresponding to the line-of-sight of the user, and wherein when the user positions him or herself in a manner that indicates that the user is watching or otherwise paying attention to the media delivery device the detector receives the signal from the emitter.
9. A system comprising: a media delivery device; a camera coupled to the media delivery device and configured to image an area corresponding to a viewing area in front of the media delivery device; an image processor coupled to the camera and configured to recognize the presence of a user's face within the viewing area; and an attention detector processor coupled to the image processor and configured to determine whether the user is perceiving content provided by the media delivery device.
10. The system of claim 9 wherein the camera is one of a still image camera or a video camera.
11. The system of claim 10 wherein the media delivery device is one of a television or a computer monitor.
12. A system comprising: a media delivery device; an accelerometer circuit attached to a portion of a user positioned proximate the media delivery device at a distance suitable to perceive the media delivery device, the accelerometer configured to provide an indication of the position of the user's head relative to the media delivery device; a detector circuit coupled to the media delivery device and configured to receive a signal transmitted from the accelerometer; and an attention detector processor coupled to the detector circuit and configured to determine whether the user is perceiving content provided by the media delivery device based on one or more signals from the accelerometer.
13. The system of claim 12 wherein the portion of the user is selected from the group consisting of the user's head, face, neck, and torso.
14. The system of claim 13 wherein the media delivery device is one of a television or a computer monitor.
15. A system comprising: a media delivery device configured to provide media content to a viewer; a sensor assembly attached to one or more parts of a user's body, and configured to measure one or more physiological characteristics of the user while the viewer perceives the media content provided by the media delivery device; a receiver circuit configured to receive data relating to the one or more physiological characteristics of the user; and a physiological sensing process configured to interpret the physiological data to assess an emotional response of the viewer to the media content.
16. The system of claim 15 further comprising a content analyzer component configured to extract certain information from the media content, and to correlate specific portions of the media content with specific emotional responses.
17. The system of claim 16 further comprising a viewer profile component providing objective viewer profile information to supplement the emotional response.
18. The system of claim 15 wherein the physiological characteristics are selected from the group consisting of: brainwave activity, breath, heartbeat, eye movement, body motion, and temperature.
19. The system of claim 15 wherein the media delivery device is one of a television or a computer monitor.
20. The system of claim 15 wherein the sensor assembly comprises a head gear positioned on the head of the user and positioned to transmit the signal in the direction of the media delivery device.
21. The system of claim 15 wherein the physiological sensing process is integrated within a media playback device coupled to the media delivery device.
22. The system of claim 21 wherein the media playback device is selected from the group consisting of: a digital video recorder, a TV cable box, a video cassette recorder, DVD player, and a gaming system.
PCT/US2008/075649 2007-09-07 2008-09-08 System and method for detecting viewer attention to media delivery devices WO2009033187A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP08829304A EP2208346A1 (en) 2007-09-07 2008-09-08 System and method for detecting viewer attention to media delivery devices

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US97089807P 2007-09-07 2007-09-07
US97090807P 2007-09-07 2007-09-07
US97091607P 2007-09-07 2007-09-07
US97090507P 2007-09-07 2007-09-07
US97091307P 2007-09-07 2007-09-07
US97090007P 2007-09-07 2007-09-07
US97092007P 2007-09-07 2007-09-07
US60/970,900 2007-09-07
US60/970,920 2007-09-07
US60/970,908 2007-09-07
US60/970,913 2007-09-07
US60/970,898 2007-09-07
US60/970,905 2007-09-07
US60/970,916 2007-09-07

Publications (1)

Publication Number Publication Date
WO2009033187A1 true WO2009033187A1 (en) 2009-03-12

Family

ID=40429431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/075649 WO2009033187A1 (en) 2007-09-07 2008-09-08 System and method for detecting viewer attention to media delivery devices

Country Status (2)

Country Link
EP (1) EP2208346A1 (en)
WO (1) WO2009033187A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011071461A1 (en) * 2009-12-10 2011-06-16 Echostar Ukraine, L.L.C. System and method for selecting audio/video content for presentation to a user in response to monitored user activity
EP2466771A1 (en) * 2010-12-16 2012-06-20 Gérard Olivier Smart audience monitoring device
US20130089006A1 (en) * 2011-10-05 2013-04-11 Qualcomm Incorporated Minimal cognitive mode for wireless display devices
EP2710752A1 (en) * 2011-05-17 2014-03-26 Webtuner Corporation System and method for scalable, high accuracy, sensor and id based audience measurement system
US9361005B2 (en) 2013-12-27 2016-06-07 Rovi Guides, Inc. Methods and systems for selecting modes based on the level of engagement of a user
US9426525B2 (en) 2013-12-31 2016-08-23 The Nielsen Company (Us), Llc. Methods and apparatus to count people in an audience
US9854581B2 (en) 2016-02-29 2017-12-26 At&T Intellectual Property I, L.P. Method and apparatus for providing adaptable media content in a communication network
KR20190135315A (en) * 2018-05-28 2019-12-06 광운대학교 산학협력단 Wearable device and method for determining concentration degree of user
US11949965B1 (en) 2023-06-23 2024-04-02 Roku, Inc. Media system with presentation area data analysis and segment insertion feature

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10187694B2 (en) 2016-04-07 2019-01-22 At&T Intellectual Property I, L.P. Method and apparatus for enhancing audience engagement via a communication network

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010056225A1 (en) * 1995-08-02 2001-12-27 Devito Drew Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein
US20030126593A1 (en) * 2002-11-04 2003-07-03 Mault James R. Interactive physiological monitoring system
US6792304B1 (en) * 1998-05-15 2004-09-14 Swinburne Limited Mass communication assessment system
US20040208496A1 (en) * 2003-04-15 2004-10-21 Hewlett-Packard Development Company, L.P. Attention detection
US20050071865A1 (en) * 2003-09-30 2005-03-31 Martins Fernando C. M. Annotating meta-data with user responses to digital content
US20050172311A1 (en) * 2004-01-31 2005-08-04 Nokia Corporation Terminal and associated method and computer program product for monitoring at least one activity of a user


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8793727B2 (en) 2009-12-10 2014-07-29 Echostar Ukraine, L.L.C. System and method for selecting audio/video content for presentation to a user in response to monitored user activity
WO2011071461A1 (en) * 2009-12-10 2011-06-16 Echostar Ukraine, L.L.C. System and method for selecting audio/video content for presentation to a user in response to monitored user activity
EP2466771A1 (en) * 2010-12-16 2012-06-20 Gérard Olivier Smart audience monitoring device
US9635405B2 (en) 2011-05-17 2017-04-25 Webtuner Corp. System and method for scalable, high accuracy, sensor and ID based audience measurement system based on distributed computing architecture
EP2710752A4 (en) * 2011-05-17 2014-10-22 Webtuner Corp System and method for scalable, high accuracy, sensor and id based audience measurement system
EP2710752A1 (en) * 2011-05-17 2014-03-26 Webtuner Corporation System and method for scalable, high accuracy, sensor and id based audience measurement system
US20130089006A1 (en) * 2011-10-05 2013-04-11 Qualcomm Incorporated Minimal cognitive mode for wireless display devices
US9361005B2 (en) 2013-12-27 2016-06-07 Rovi Guides, Inc. Methods and systems for selecting modes based on the level of engagement of a user
US9918126B2 (en) 2013-12-31 2018-03-13 The Nielsen Company (Us), Llc Methods and apparatus to count people in an audience
US9426525B2 (en) 2013-12-31 2016-08-23 The Nielsen Company (Us), Llc. Methods and apparatus to count people in an audience
US10560741B2 (en) 2013-12-31 2020-02-11 The Nielsen Company (Us), Llc Methods and apparatus to count people in an audience
US11197060B2 (en) 2013-12-31 2021-12-07 The Nielsen Company (Us), Llc Methods and apparatus to count people in an audience
US11711576B2 (en) 2013-12-31 2023-07-25 The Nielsen Company (Us), Llc Methods and apparatus to count people in an audience
US9854581B2 (en) 2016-02-29 2017-12-26 At&T Intellectual Property I, L.P. Method and apparatus for providing adaptable media content in a communication network
US10455574B2 (en) 2016-02-29 2019-10-22 At&T Intellectual Property I, L.P. Method and apparatus for providing adaptable media content in a communication network
KR20190135315A (en) * 2018-05-28 2019-12-06 광운대학교 산학협력단 Wearable device and method for determining concentration degree of user
KR102231979B1 (en) 2018-05-28 2021-03-24 광운대학교 산학협력단 Wearable device and method for determining concentration degree of user
US11949965B1 (en) 2023-06-23 2024-04-02 Roku, Inc. Media system with presentation area data analysis and segment insertion feature

Also Published As

Publication number Publication date
EP2208346A1 (en) 2010-07-21

Similar Documents

Publication Publication Date Title
US20090088610A1 (en) Measuring Physiological Response to Media for Viewership Modeling
US20090070798A1 (en) System and Method for Detecting Viewer Attention to Media Delivery Devices
EP2208346A1 (en) System and method for detecting viewer attention to media delivery devices
US20240127269A1 (en) Systems and methods providing en mass collection and centralized processing of physiological responses from viewers
KR100913865B1 (en) Apparatus and method for collecting user evaluation regarding contents
JP4335642B2 (en) Viewer reaction information collecting method, user terminal and viewer reaction information providing device used in the viewer reaction information collecting system, and program for creating viewer reaction information used for realizing the user terminal / viewer reaction information providing device
RU2601287C1 (en) Device for creating interest on the part of a viewer viewing the content
US8442578B2 (en) Mobile personal services platform for providing feedback
KR100946222B1 (en) Affective television monitoring and control
JP2023011578A (en) Brain wave measuring device, brain wave measuring method, and brain wave measuring program
WO2009073634A1 (en) Correlating media instance information with physiological responses from participating subjects
US20090247895A1 (en) Apparatus, method, and computer program for adjustment of electroencephalograms distinction method
JP2018159908A (en) Information processing apparatus, information processing system, and program
EP2333778A1 (en) Digital data reproducing apparatus and method for controlling the same
JP2017021737A (en) Program, terminal and system for giving emotional identifier to application by using myoelectrical signal
JP2012005702A (en) Information processing system and information processing apparatus
JP2006020131A (en) Device and method for measuring interest level
Barreda-Ángeles et al. Psychophysiological methods for quality of experience research in virtual reality systems and applications
US11416128B2 (en) Virtual group laughing experience
JP2018093350A (en) Attention degree evaluation system
JP7121937B2 (en) MOVIE GENERATION DEVICE, MOVIE GENERATION/PLAYBACK SYSTEM, MOVIE GENERATION METHOD, MOVIE GENERATION PROGRAM
CN108887961B (en) Seat and seat-based concentration evaluation method
JP2005210155A (en) Mobile viewing apparatus
BG109585A (en) Method and device for registration and identification of the participants in a study of the spectator interest in television and radio programs
TW200920137A (en) Method and mobile device for detecting highlights

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08829304

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2008829304

Country of ref document: EP