WO2005048225A1 - Methods of, and systems for, controlling a display apparatus and related audio output - Google Patents


Info

Publication number
WO2005048225A1
Authority
WO
WIPO (PCT)
Prior art keywords
audio signal
camera
message
display apparatus
live event
Application number
PCT/GB2004/004782
Other languages
French (fr)
Inventor
Alastair Breward
Original Assignee
Alastair Breward
Application filed by Alastair Breward filed Critical Alastair Breward
Publication of WO2005048225A1 publication Critical patent/WO2005048225A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241: Advertisements
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F: DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00: Advertising or display means not otherwise provided for

Definitions

  • This invention relates to methods and systems for controlling a display apparatus for use at live events or televised live events, and for controlling related audio output in such televised events.
  • Live events, such as sporting events, frequently attract large audiences, both physically and by means of television broadcast.
  • For this reason, still media such as picture advertisements are typically displayed at the venue, where they will be seen by these audiences.
  • A proportion of these advertisements are not displayed continuously but are alternated mechanically or electronically.
  • For example, several advertisements may be printed onto a single loop of material which is then scrolled so that individual advertisements are viewed in turn.
  • Alternatively, three advertisements are depicted in strips along a series of long thin triangular prisms mounted in parallel and rotated synchronously, so that all strips of one advertisement are displayed together in space (so reconstituting the whole image) and in time.
  • Alternatively, the multiple advertisements are electronically stored and displayed in turn on a single electronic display device.
  • US patent 4,806,924 describes a dynamic advertising system in which an advertising message is maintained in view of a television camera. Each camera is equipped with an infrared radiation gun aligned with the shooting axis of the associated camera.
  • The dynamic advertising system includes a number of display panels arranged side-by-side, each of which displays a single character of an advertising message to be displayed.
  • A computer system is connected to the display panels to move the advertising message along the display panels in correspondence with the detected movement of the camera, so that the advertising message is constantly in view. It is an object of the invention to increase the effectiveness of display apparatus for advertisements, and of other types of display apparatus, at live events, by drawing attention to the images displayed, by playing related sound when images are displayed, or both.
  • In accordance with one aspect of the invention, there is provided a method of controlling a display system at a live event, the system comprising a dynamic display apparatus capable of altering a display from the display of a first message to a second, independent, message, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the dynamic display apparatus in response to the monitoring to alter the display from the first message to the second message.
  • The present invention enables advertisers, or those providing advertising space to them, to increase the effectiveness of advertisements provided by means of the display apparatus, by increasing the correlation between (i) the fact that a large number of people are, at a given moment, either (if viewing remotely via television) seeing an image which contains the space where the advertisement is to be displayed or (if present at a venue) looking in the general direction of that space (on the assumption that, broadly, those present will look at the place where the action is, which is also where the cameras will be pointed), and (ii) the incidence of motion in that space which draws the eyes of those people. Uses other than for commercial advertising are also possible.
  • In accordance with a further aspect of the invention, there is provided a method of controlling an audio generating system at a live event, the system comprising an audio signal generator, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the audio signal generator in response to the monitoring.
  • This provides a means of controlling an audio output device so as to deliver audio output which is semantically related to an image element, such as an advertisement message picked up by the camera.
  • Thus, the invention provides for the occurrence of sound effects related to the advertisement being (or about to be) displayed on screen to a television audience.
  • The invention also relates to apparatus arranged to perform methods according to each aspect of the invention.
  • In preferred embodiments, pan, tilt and zoom sensors are coupled to one or more television cameras at a live event in order to determine the field of view of the (or each) camera.
  • Other methods to determine the field of view may be used.
  • The field of view information is input to a computer containing data about the positions and orientation of one or more image billboards ("IBs") present at the live event, being used to display images visible to cameras (and usually also to persons physically present at the event).
  • One example of an IB is a billboard which contains two or more printed advertisements or other images on a loop of material, and a device to scroll the material round periodically (or on some cue) so that a different image is visible, thus alternating two or more images in a single space.
  • IBs may also use electronic display devices (either to display a sequence of individually still images, or to display a moving image or a series of them in rotation). Such electronic display devices are capable of displaying different images in dependence upon an image signal input into the device.
  • IBs which are capable of changing the image displayed are referred to as dynamic IBs or DIBs.
  • Other IBs are referred to as static IBs or SIBs.
  • The computer is programmed to detect occasions when an IB is visible in the image being recorded by the (or any) camera. Subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to instruct each such IB which is a DIB, when so visible, to change the image it is displaying. Alternatively, or in addition, again subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to effect the inclusion, in the soundtrack accompanying any television programme displaying images containing any such IB (such soundtrack being referred to herein as the soundtrack "related to" that IB or the "related soundtrack"), of semantically related audio material.
  • Examples of special circumstances include: (i) inhibiting or delaying instructions to take account of other events in the camera's field of view such as key game incidents, or (ii) for multi-camera live events, polling a source of information about which camera is 'live' at any given moment.
  • The change in the IB (if a DIB) or in the related soundtrack occurs when the IB is being viewed by the television camera, and hence (in most cases) by a far larger audience than might be viewing it but for its presence in the field of view of one or more cameras. It is generally established that the human eye is drawn to motion, and therefore any image which scrolls into view within a viewed scene is more impactful than one already on display in that scene.
  • Where an IB is a DIB displaying a moving image, or a series of them, the computer could, instead of effecting a change to another image, be programmed to initiate a change of state within the currently displayed image.
  • Changes to soundtracks may be effected electronically (using either digital or analogue technology) by overlaying audio material onto the underlying soundtrack being recorded or otherwise prepared for use with the visual images obtained from the television cameras, or they may be effected by feeding an audio signal into loudspeakers at the venue so as to form part of the ambient sound in the vicinity of microphones being used to capture the sounds to be or being broadcast, for example in the vicinity of sports commentators.
  • The track might be words, a jingle, or a sound effect.
  • Figure 1 shows a perspective view of part of a sports arena, and certain physical elements relevant to use of the invention.
  • Figure 2 is a block diagram of a subset of the components which make up a camera sub-system arranged in accordance with an embodiment of the invention.
  • Figure 3 is a block diagram of a subset of the components which make up an IB sub-system arranged in accordance with an embodiment of the invention.
  • Figure 4 is a flow chart describing the operation of television recording and apparatus arranged in accordance with an embodiment of the invention, when at least some of the IBs in use are DIBs and when changes to related soundtracks are to be made. If only SIBs are to be used, with related soundtrack changes, or if DIBs are to be used without soundtrack changes, certain parts of the flow chart are redundant.
  • Figure 5 is a flow chart describing the operation of television recording and apparatus arranged in accordance with a further embodiment of the invention.
  • Figure 1 shows a typical live event venue, in this case a soccer stadium, including a camera and an IB.
  • The invention is equally capable of being applied to motor racing, other racing events, tennis, basketball, golf, athletics and other sports, and to non-sporting events such as award ceremonies, official ceremonies, other ceremonies, firework and other public displays, marches and demonstrations, and in general any event at which IBs are in use. It would also be possible to embed use of the invention in audiovisual works such as dramatic works of fiction or biographical reconstructions. For ease of illustration, it is assumed there are only one camera and only one IB at the live event, but in practice there will generally be a number of each.
  • The camera is fitted with means to detect repeatedly which points in space are at any time visible through that camera.
  • These points form an approximate cone ("view-cone"), whose axis is the line along which the camera is pointing, whose viewing angles (the angles made at the vertex of the view-cone in horizontal and vertical planes respectively, each such plane passing through the axis of the view-cone) are determined by the design of the camera and the degree of 'zoom' being used by the camera operator, and whose length may be either regarded as infinite or else arbitrarily set to a size greater than the longest distance the camera could view within the venue.
  • Figure 1 shows the view-cone truncated by the ground and by the plane of the IB only for ease of illustration. These truncations are shaded.
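The view-cone membership test described above can be sketched in code. This is an illustrative model only, not taken from the patent: it approximates the cone as circular (a single half-angle rather than separate horizontal and vertical viewing angles), and all names (`camera_axis`, `point_in_view_cone`) are invented for the example.

```python
import math

def camera_axis(pan_deg: float, tilt_deg: float) -> tuple:
    """Unit vector along the camera's line of sight, from pan/tilt angles.
    Convention (an assumption): pan rotates about the vertical axis and
    tilt raises the axis above the horizontal plane."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    return (math.cos(tilt) * math.cos(pan),
            math.cos(tilt) * math.sin(pan),
            math.sin(tilt))

def point_in_view_cone(cam_pos, pan_deg, tilt_deg, half_angle_deg, point,
                       max_range=None):
    """True if `point` lies inside the (approximate, circular) view-cone."""
    axis = camera_axis(pan_deg, tilt_deg)
    v = tuple(p - c for p, c in zip(point, cam_pos))
    dist = math.sqrt(sum(x * x for x in v))
    if dist == 0:
        return True  # the camera's own position is trivially "in view"
    if max_range is not None and dist > max_range:
        return False  # beyond the arbitrarily set cone length
    # Angular separation between the axis and the direction to the point.
    cos_sep = sum(a * b for a, b in zip(axis, v)) / dist
    sep_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))
    return sep_deg <= half_angle_deg
```

In a real system the half-angle would be derived from the zoom-sensor reading; here it is simply a parameter.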
  • FIG. 2 is a block diagram of a subset of the components which make up the invention, namely one camera sub-system.
  • Camera 140 has a zoom lens, including a 2X expander (range extender).
  • Coupled to camera 140 is a 2X expander/zoom/focus sensor 152 (collectively a "zoom sensor"), which senses the zoom of the camera, the focal distance of the camera, and whether the 2X expander is being used.
  • The analogue output of sensor 152 is sent to an analogue-to-digital converter 154, which converts the analogue signal to a digital signal and transmits the digital signal to processor 156.
  • Camera 140 is mounted on tripod 144 which includes pan and tilt heads that enable broadcast camera 140 to pan and tilt. Attached to tripod 144 are pan sensor 146 and tilt sensor 148, both of which are connected to pan-tilt electronics 150. Alternatively camera 140 can include a built in pan and tilt unit. In either configuration, pan sensor 146, tilt sensor 148 and zoom sensor 152 are considered to be coupled to camera 140 because they can sense data representing the pan, tilt, zoom and focus of broadcast camera 140.
  • Processor 156 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute view-cone data in time for the IB/audio control processor to issue DIB update decisions and soundtrack changes in time for their implementation while the relevant IBs are still being viewed in the associated cameras, as further explained below).
  • Processor 156 also includes memory and a disk drive to store data and software.
  • Processor 156 is in communication with an IB/audio control system, which is described below (in relation to Figure 3).
  • Pan sensor 146 and tilt sensor 148 are optical encoders that output a signal, measured as a number of clicks, indicating the rotation of a shaft. Forty thousand (40,000) clicks represent a full 360° rotation. Thus, a processor can divide the number of measured clicks by 40,000 and multiply by 360 to determine the pan or tilt angle in degrees.
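The click-to-angle conversion just described is a one-line calculation; a minimal sketch (function name invented for illustration):

```python
CLICKS_PER_REVOLUTION = 40_000  # clicks per full 360° rotation of the shaft

def clicks_to_degrees(clicks: int) -> float:
    """Convert an optical-encoder click count to a pan or tilt angle in degrees."""
    return clicks / CLICKS_PER_REVOLUTION * 360.0
```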
  • The pan and tilt sensors use standard technology known in the art and can be replaced by other suitable pan and tilt sensors known to those skilled in the relevant art.
  • Pan/tilt electronics 150 receives the output of pan sensor 146 and tilt sensor 148, converts the output to a digital signal (representing pan and tilt) and transmits the digital signal to processor 156. The pan, tilt and zoom sensors are used to determine the view-cone of the camera.
  • One or more of the pan, tilt and zoom sensors can together be labelled as a view-cone sensor.
  • In a minimal embodiment, the view-cone sensor would include only a pan sensor.
  • The pan, tilt and zoom sensors may be embodied as described above, but other embodiments are possible. Two examples are taught in International patent application number WO98/18261, pp 8-9 starting at line 21, namely (i) marking known locations in the venue such that each mark looks different and one will always be in view of a camera, so that an image thereof is detectable to the processor analysing the image recorded by the camera, or (ii) placing electromagnetic, infra-red or similar emitters around the stadium which are sensed by sensors on the cameras, again enabling computation of relative location and orientation.
  • FIG. 3 is a block diagram of a subset of the components which make up the invention, namely the IB/audio sub-system.
  • Processor 200 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute update decisions and soundtrack changes in time for their implementation while the related IBs are still being viewed in the associated cameras, as further explained below).
  • Processor 200 also includes memory 202, a disk drive 204 to store data and software, a VDU/keyboard/mouse 206, a removable drive 208 or other means to load pre-created data, e.g. defining different camera systems, different venue topographies.
  • The pre-created data would include the location and orientation in space of all IBs (in order to establish whether or not, at any given time, an IB is in principle within a view-cone), the angle beyond which an IB is no longer considered to be oriented to a camera, and any occlusions present.
  • Certain IBs which are in principle 'in view' may be facing too obliquely to be seen effectively via the camera, or may be occluded. For example, if an IB is perpendicular to the line of view of a camera (and facing that way) it is clearly in view, and may be so when at an angle of 45 degrees to perpendicular, but not when at a steeper angle such as 30 degrees to the line of view, or when behind a known obstacle.
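The obliqueness test could be sketched as follows. This is an assumption-laden illustration: it models the IB as a flat panel with an outward-facing unit normal, and treats the threshold angle as a user-chosen parameter rather than the fixed values discussed above; the function name is invented.

```python
import math

def ib_oriented_to_camera(ib_pos, ib_normal, cam_pos, max_angle_deg=45.0):
    """True if the IB faces the camera within `max_angle_deg` of head-on.
    `ib_normal` is a unit vector pointing out of the IB's display face."""
    to_cam = tuple(c - b for c, b in zip(cam_pos, ib_pos))
    dist = math.sqrt(sum(x * x for x in to_cam))
    if dist == 0:
        return False  # degenerate case: camera at the IB's own position
    # Angle between the display face's normal and the direction to the camera.
    cos_a = sum(n * t for n, t in zip(ib_normal, to_cam)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_angle_deg
```

Occlusion by known obstacles would be an additional, separate check against the stored venue data.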
  • The processor 200 is connected to the (or each) camera subsystem 198 to obtain view-cone data and possibly other input. Where any IBs in use are DIBs, processor 200 is also connected to them (210) by means enabling the processor to instruct each DIB to update its display, and ideally also enabling the DIB to inform the processor whenever its display is updated, if it has the capability to update other than by instruction from the processor. (Where all IBs in use are SIBs, no connection is necessary or useful.) Processor 200 is also connected to the audio output system 212, and to stored information including the audio output to be overlaid or otherwise included in the soundtrack.
  • Processor 200 may also be connected to other systems, such as position sensing systems for specific objects of significance such as sports balls, or systems such as joysticks or customised consoles for inputting manual change commands to over-ride those generated by the processor.
  • Figure 4 is a flow chart describing the operation of the invention, when simplified to comprise only one camera and one IB.
  • In step 302, pan, tilt and zoom data are sensed, and the view-cone (that is, its size, shape, location and orientation) is determined in step 304 using the data sensed in step 302 together with other stored data, including the length of the view-cone, the position of the camera within the venue, and the rest angle of view of the camera, and is communicated to the IB/audio control system.
  • Processor 200 then determines whether the IB is within the view-cone of the camera.
  • Next, the processor determines whether the IB is a DIB or a SIB. If the latter, steps 308 to 312 inclusive are skipped.
  • In step 308, the processor analyses whether there is any over-riding reason not to issue an instruction to the DIB to update its image. The most basic reason not to update would be that the processor's record of past updates and timings shows that the DIB has been instructed to update within the last few seconds (where the precise number of seconds is user-determined).
  • Another would be that the processor has received information from the DIB that it has self-updated recently, in which case at this step the processor updates its own records of past updates and timings.
  • Another reason might be, if the relevant event were a soccer match and the soccer ball contained a position recording device whose output was available to the IB/audio control system, that the soccer ball was in the view-cone, stationary and located on the penalty spot while the DIB was located behind the goal area, so that an update to the DIB would potentially interfere with the concentration of the player taking the penalty kick.
  • Object tracking may be accomplished by various technical means, including treating the ball or other object with spectral coatings so that it will reflect (or emit) a distinct signal which can be detected by the camera(s) with a filter or other sensor.
  • In step 310, if the answer to whether to veto was negative, processor 200 instructs the DIB to update, and also updates its own records of past updates and timings.
  • In step 312, the processor determines whether a soundtrack change is also to be considered (which is user-determined and available from memory 202 or other media 204). If it is, and not otherwise, in step 314 the processor analyses whether there is any over-riding reason not to issue an instruction to the audio output system to deliver audio content. (An example, where the images being displayed are advertisements, might be that advertising is deemed more impactful overall if a related soundtrack change is made only sometimes.)
  • In step 316, the processor instructs the audio output system to deliver audio output for inclusion in the soundtrack, by overlay, loudspeaker or other technology. Finally, the processor updates its own records of past soundtrack changes and timings.
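The single-camera control flow of steps 302 to 316 might be outlined as below. This is an interpretive sketch, not the patent's implementation: the veto logic is reduced to the "recently updated" check, and all class, method and field names are invented for illustration.

```python
import time

MIN_UPDATE_INTERVAL_S = 5.0  # the user-determined "last few seconds" window

class IBAudioController:
    """Hypothetical controller corresponding to processor 200 in Figure 3."""

    def __init__(self, update_interval_s=MIN_UPDATE_INTERVAL_S):
        self.update_interval_s = update_interval_s
        self.last_update = {}   # IB id -> time of last DIB update
        self.last_sound = {}    # IB id -> time of last soundtrack change

    def step(self, ib, in_view_cone, now=None):
        """One pass of the Figure 4 loop for a single IB; returns actions taken."""
        now = time.monotonic() if now is None else now
        actions = []
        if not in_view_cone:                        # IB not in the view-cone
            return actions
        if ib["is_dib"]:                            # SIBs skip steps 308-312
            elapsed = now - self.last_update.get(ib["id"], float("-inf"))
            if elapsed >= self.update_interval_s:   # step 308: veto check
                self.last_update[ib["id"]] = now    # step 310: instruct + record
                actions.append("update_dib")
        if ib.get("soundtrack_enabled"):            # step 312 (veto step omitted)
            self.last_sound[ib["id"]] = now         # step 316: deliver + record
            actions.append("play_audio")
        return actions
```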
  • FIG. 5 is a flow chart describing the operation of the invention, when applied to a system comprising multiple cameras and multiple IBs.
  • In step 402, pan, tilt and zoom data are sensed for each camera, and the view-cone of each is determined in step 404 and communicated to the IB/audio control system.
  • In step 405, processor 200 determines which IBs are within the view-cones of which cameras and stores this data as an array, then checks that the array is not empty in step 406. If there are no IBs within any view-cones, the processing ends and reiterates from step 402.
  • In step 407, the processor analyses the array to establish an order of priority in which to evaluate those IBs (evaluation meaning, for any IBs which are DIBs, deciding whether to instruct an update, and, for any IBs for which soundtrack changes are enabled, whether to instruct such changes).
  • The analysis is performed based on stored rules and data.
  • The order of priority could be simply arbitrary, but the aim of the stored rules and data would be to bring forward in time the consideration of updates to those DIBs, and of soundtrack changes relating to those IBs, judged most likely to feature prominently in any final television programme, and to decide an order for consideration where two or more updates (perhaps by virtue of relating to closely positioned IBs) are held equally likely to feature equally prominently.
  • When step 407 ends, the list produced could be of any length, and could comprise a single IB.
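One plausible scoring scheme for the prioritisation in step 407: weight each camera by a heuristic chance that its feed is 'live', weight each sighted IB by how close it sits to that camera's axis, and sort. The formula, the field names and the 90° normalisation are all illustrative assumptions, not taken from the patent.

```python
def prioritise_ibs(sightings):
    """Order (camera, IB) sightings for evaluation, highest priority first.

    `sightings` is a list of dicts with keys (all hypothetical):
      ib_id:              identifier of the sighted IB
      camera_live_weight: 0..1 heuristic chance this camera's feed is used
      axis_offset_deg:    angular distance of the IB from the camera axis
    Each IB appears once in the result, under its best-scoring sighting.
    """
    def score(s):
        # Closer to the camera axis -> assumed more prominent on screen.
        centrality = max(0.0, 1.0 - s["axis_offset_deg"] / 90.0)
        return s["camera_live_weight"] * centrality

    ordered, seen = [], set()
    for s in sorted(sightings, key=score, reverse=True):
        if s["ib_id"] not in seen:
            seen.add(s["ib_id"])
            ordered.append(s["ib_id"])
    return ordered
```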
  • The processor then repeats, for each IB in the order of priority, steps 307 to 316 inclusive as taught in the flowchart in Figure 4.
  • Optionally, the processor limits the number of change commands issued, and may therefore suppress further processing part way through the order of priority established in step 407.
  • Optionally, the processor interrupts itself during list processing to commence processing of a further priority list prepared later in time.
  • For multi-camera events, it becomes desirable for processor 200 to have an information signal as to which cameras are currently 'relevant', meaning that they are recording events which are being viewed live, or which will be viewed later, or which may be viewed later.
  • For some events, it may be possible and practicable to supply the computer with precise data about which camera is 'relevant' at every moment throughout an event. Otherwise, it will be practicable to create heuristic rules for use by the computer in assessing how to prioritise cameras.
  • For example, image analysis or other technology might be used to determine whether or not the camera is viewing a specific object (such as a soccer ball) and, if so, from how far away.
  • DIB changes will occur, and while many will be redundant, many others will be seen. In such cases, it might be appropriate to vary the other instructions given to the computer so as to raise in general the number of 'change' instructions issued to DIBs.
  • Each soundtrack change is effected in the audio material specific to the video footage from the camera viewing the relevant IB, as soundtrack changes made to a general soundtrack (to be accompanied by a video track assembled later from footage available from various cameras) cannot be made to relate to viewing of the relevant IBs.
  • The computer can take note of the speed with which a camera is moving (panning and tilting), and can be instructed to inhibit IB update and/or soundtrack change instructions relating to IBs in view of that camera, if it is predictable that the IB will be out of view (or partly out of view, or about to be out of view) by the time it actions the instruction to change, or if the relevant images require more time for digestion by the audience. It is feasible, when using this invention, to operate new forms of advertising, in which the message to be communicated to the audience relies on the fact that IB updates and/or soundtrack changes can be linked to events or otherwise controlled.
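The pan-speed inhibition could be approximated by extrapolating the pan angle over the actuation latency and re-testing visibility. A hedged sketch, restricted to the horizontal plane for simplicity; the latency value and all names are invented:

```python
def likely_out_of_view(ib_bearing_deg, pan_deg, pan_rate_deg_s,
                       half_angle_deg, latency_s=0.5):
    """Predict whether an IB will have left the view-cone by the time a
    change instruction could be actioned (`latency_s` is an assumed delay).

    Bearings and pan angles are measured in the horizontal plane only.
    """
    predicted_pan = pan_deg + pan_rate_deg_s * latency_s
    # Smallest angular difference between the IB bearing and the predicted axis.
    offset = abs((ib_bearing_deg - predicted_pan + 180.0) % 360.0 - 180.0)
    return offset > half_angle_deg
```

A controller might call this before issuing an update and skip the instruction when it returns True.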
  • Advertising is a highly creative field, but examples of how this might occur include: (i) At a soccer event, there are DIBs around the ground. One sponsor of a soccer team reserves all image spaces ("panels") on certain DIBs near the goal areas, on all save one of which appear a neutral slogan, and on the last panel is a message thought suitable for display when the sponsored team is moving forward in attack. (During each half, the 'attack' message near the team's own goal area is unused.) When the team surges forward, the audience is shown the 'attack' message, which is therefore additionally impactful because synchronised with suitably reinforcing live events.
  • Advertisements sharing one display device would no longer be displayed for fixed periods.
  • This invention involves information processing relating both to camera/audio-recording systems (which are typically owned and operated by television and broadcast entities) and IB systems (which are typically owned and operated by sports clubs or venue operators). Given that much of the specific pre-programmed information required relates to the physical characteristics of the venue and the kinds of event which take place there, it would seem prima facie simpler for venue operators to control the computing functions, and receive feeds from cameras recording events on specific occasions, and this is the approach taken in this description, i.e. where multiple cameras and IBs exist, the cameras all singly feed data to a unitary IB/audio control system.
  • An individual element of advertising or other visual material could involve motion; for example, it could be a short sequence of video footage (displayed on an electronic display device, or by projection on a screen), or a sequence of two or more still images, intended to be displayed in order. It is feasible to use the invention with mobile IBs. In such a case, the processor 200 needs to receive information about the position and orientation of IBs as this data changes. It is usual but not essential that the same IBs are visible to the physically present audience and to the television audience. It would be feasible in some circumstances to position IBs so that they are capable of being seen only by the cameras.

Abstract

Pan, tilt and zoom sensors are coupled to one or more television cameras at a live event in order to determine the field of view of the (or each) camera. This information is input to a computer containing data about the positions and orientation of one or more image billboards ('IBs') present at the live event, being used to display advertising or other images visible to cameras. The computer is programmed to detect occasions when an IB is visible in the image being recorded by the (or any) camera. Subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to (i) in respect of any IBs which are capable of changing their displays, instruct each such IB, when so visible, to change the image it is displaying, and/or (ii) in respect of any IBs for which associated audio material is available, introduce that audio material into the soundtrack which is to accompany the video data being captured by the cameras.

Description

Methods of, and Systems for, Controlling a Display Apparatus and Related Audio Output
Field of the Invention This invention relates to methods and systems for controlling a display apparatus for use at live events or televised live events, and for controlling related audio output in such televised events.
Background of the Invention Live events, such as sporting events, frequently attract large audiences, both physically and by means of television broadcast. For this reason, still media such as picture advertisements are typically displayed at the venue where they will be seen by these audiences. A proportion of these advertisements are not displayed continuously but alternated mechanically or electronically. For example, several advertisements may be printed onto a single loop of material which is then scrolled so that individual advertisements are viewed in turn. Alternatively, three advertisements are depicted in strips along a series of long thin triangular prisms mounted in parallel and rotated synchronously, so that all strips of one advertisement are displayed together in space (so reconstituting the whole image) and time. Alternatively, the multiple advertisements are electronically stored and displayed in turn on a single electronic display device. In part this allows sharing of the one physical venue, and in part the technique draws the attention of the audience, since it is well known that the human eye is drawn to moving elements in a relatively static background, and hence the advertisements are more noticed and therefore more impactful. It is assumed that on average through the period of the event, a certain number of people will be looking sufficiently near to the display device for any motion on it to catch their attention and so cause them to see the advertisement. US patent 4,806,924 describes a dynamic advertising system in which an advertising message is maintained in view of a television camera. Each camera is equipped with an infrared radiation gun aligned with the shooting axis of the associated camera. The dynamic advertising system includes a number of display panels arranged side-by-side, each of which displays a single character of an advertising message to be displayed. 
A computer system is connected to the display panels to move the advertising message along the display panels in correspondence with the detected movement of the camera, so that the advertising message is constantly in view. It is an object of the invention to increase the effectiveness of display apparatus for advertisements, and other types of display apparatus, at live events, by drawing attention to images displayed and by playing related sound when images are displayed, or both.
Summary of the Invention In accordance with one aspect of the invention, there is provided a method of controlling a display system at a live event, the system comprising a dynamic display apparatus capable of altering a display from the display of a first message to a second, independent, message, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the dynamic display apparatus in response to the monitoring to alter the display from the first message to the second message. The present invention enables advertisers, or those providing advertising space to them, to increase the effectiveness of advertisements provided by means of the display apparatus, by increasing the correlation between (i) the fact that a large number of people are, at a given moment, either (if viewing remotely via television) seeing an image which contains the space where the advertisement is to be displayed or (if present at a venue) looking in the general direction of that space (on the assumption that, broadly, those present will look at the place where the action is, which is also where the cameras will be pointed), and (ii) the incidence of motion in that space which draws the eyes of those people. Uses other than for commercial advertising are also possible. In accordance with a further aspect of the invention, there is provided a method of controlling an audio generating system at a live event, the system comprising an audio signal generator, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the audio signal generator in response to the monitoring. This method provides a method of controlling an audio output device so as to deliver audio output which is semantically related to an image element, such as an advertisement message picked up by the camera. 
Thus, the invention provides for the occurrence of sound effects related to the advertisement being (or about to be) displayed on screen to a television audience. The invention also relates to apparatus arranged to perform methods according to each aspect of the invention. In preferred embodiments of the invention, pan, tilt and zoom sensors are coupled to one or more television cameras at a live event in order to determine the field of view of the (or each) camera. Other methods to determine the field of view may be used. The field of view information is input to a computer containing data about the positions and orientation of one or more image billboards ("IBs") present at the live event, being used to display images visible to cameras (and usually also to persons physically present at the event). One example of an IB is a billboard which contains two or more printed advertisements or other images on a loop of material, and a device to scroll the material round periodically (or on some cue) so that a different image is visible, thus alternating two or more images in a single space. As another example, three images can be depicted in strips along a series of long thin triangular prisms mounted in parallel and rotated synchronously, so that all strips of one image are displayed together in space (so reconstituting the whole image) and time. IBs may also use electronic display devices (either to display a sequence of individually still images, or to display a moving image or a series of them in rotation). Such electronic display devices are capable of displaying different images in dependence upon an image signal input into the device. Collectively, these three and other kinds of IB which are capable of changing the image displayed are referred to as dynamic IBs or DIBs. Other IBs are referred to as static IBs or SIBs. The computer is programmed to detect occasions when an IB is visible in the image being recorded by the (or any) camera.
Subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to instruct each such IB which is a DIB, when so visible, to change the image it is displaying. Alternatively, or in addition, again subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to effect the inclusion in the soundtrack accompanying any television programme displaying images containing any such IB (such sound track being referred to herein as the soundtrack "related to" that IB or the "related soundtrack"), of semantically related audio material. Examples of special circumstances include: (i) inhibiting or delaying instructions to take account of other events in the camera's field of view such as key game incidents, or (ii) for multi-camera live events, polling a source of information about which camera is 'live' at any given moment. The change in the IB (if a DIB) or in the related soundtrack occurs when the IB is being viewed by the television camera, and hence (in most cases) by a far larger audience than might be viewing it but for its presence in the field of view of one or more cameras. It is generally established that the human eye is drawn to motion and therefore any image which scrolls into view within a viewed scene is more impactful than one already on display in that scene. Where an IB is a DIB displaying a moving image, or series of them, the computer could, instead of effecting a change to another image, be programmed to initiate a change of state within the currently displayed image. 
Changes to soundtracks may be effected electronically (using either digital or analogue technology) by overlaying audio material into the underlying soundtrack being recorded or otherwise prepared for use with the visual images obtained from the television cameras, or they may be effected by feeding an audio signal into loudspeakers at the venue so as to form part of the ambient sound in the vicinity of microphones being used to capture the sounds to be or being broadcast, for example, in the vicinity of sports commentators. The track might be words, a jingle, or a sound effect. For example, every time (or at least, on some occasions) when a fizzy drink advertisement comes into view (whether scrolling or not), there is a background sound of a bottle opening and fizzing enticingly. All technology required to effect soundtrack changes by either means is well known and previously taught.

Brief Description of the Drawings

Figure 1 shows a perspective view of part of a sports arena, and certain physical elements relevant to use of the invention.
Figure 2 is a block diagram of a subset of the components which make up a camera sub-system arranged in accordance with an embodiment of the invention.
Figure 3 is a block diagram of a subset of the components which make up an IB sub-system arranged in accordance with an embodiment of the invention.
Figure 4 is a flow chart describing the operation of television recording and display apparatus arranged in accordance with an embodiment of the invention, when at least some of the IBs in use are DIBs and when changes to related soundtracks are to be made. If only SIBs are to be used, with related soundtrack changes, or if DIBs are to be used without soundtrack changes, certain parts of the flow chart are redundant.
Figure 5 is a flow chart describing the operation of television recording and display apparatus arranged in accordance with a further embodiment of the invention.
Detailed Description of Preferred Embodiments of the Invention
A first embodiment

A first embodiment of the invention will now be described. Figure 1 shows a typical live event venue, in this case a soccer stadium, including a camera and an IB. The invention is equally capable of being applied to motor racing, other racing events, tennis, basketball, golf, athletics and other sports, and to non-sporting events such as award ceremonies, official ceremonies, other ceremonies, firework and other public displays, marches and demonstrations, and in general any event at which IBs are in use. It would also be possible to embed use of the invention in audiovisual works such as dramatic works of fiction or biographical reconstructions. For ease of illustration, it is assumed there is only one camera and only one IB at the live event, but in practice there will generally be a number of each. The camera is fitted with means repeatedly to detect which points in space are at any time visible through that camera. These points form an approximate cone ("view-cone"), whose axis is the line along which the camera is pointing, whose viewing angles (the angles made at the vertex of the view-cone in horizontal and vertical planes, respectively, each such plane passing through the axis of the view-cone) are determined by the design of the camera and the degree of 'zoom' being used by the camera operator, and whose length may either be regarded as infinite or else arbitrarily set to a size greater than the longest distance the camera could view within the venue. Figure 1 shows the view-cone truncated by the ground and by the plane of the IB only for ease of illustration. These truncations are shaded. In this embodiment of the invention, both the dimensions and orientation of the camera's view-cone are monitored continuously by frequent measurement of relevant data and re-computation. Figure 2 is a block diagram of a subset of the components which make up the invention, namely one camera sub-system.
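By way of illustration only, the repeated re-computation of the view-cone from sensed pan, tilt and zoom data might be sketched as follows. The function name, parameter names and axis conventions are assumptions made for this sketch, not a prescribed implementation:

```python
import math

def view_cone(pan_deg, tilt_deg, h_view_deg, v_view_deg):
    """Return the view-cone axis (a unit vector) and its horizontal and
    vertical half-angles in radians. Pan and tilt come from the camera
    sensors; the viewing angles are determined by the camera design and
    the degree of zoom in use."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Convention assumed here: z is 'up', pan rotates about z, and tilt
    # raises the axis out of the horizontal plane.
    axis = (math.cos(tilt) * math.cos(pan),
            math.cos(tilt) * math.sin(pan),
            math.sin(tilt))
    return axis, math.radians(h_view_deg) / 2.0, math.radians(v_view_deg) / 2.0
```

Together with the camera's position and the chosen cone length, this suffices to describe the size, shape, location and orientation of the view-cone at any instant.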
Camera 140 has a zoom lens, including a 2X expander (range extender). Connected to camera 140 is a 2X expander/zoom/focus sensor 152 (collectively a "zoom sensor") which senses the zoom in the camera, the focal distance of the camera, and whether the 2X expander is being used. The analogue output of sensor 152 is sent to an analogue to digital converter 154, which converts the analogue signal to a digital signal, and transmits the digital signal to processor 156. One alternative includes using a zoom sensor with a digital output, which would remove the need for analogue to digital conversion. Camera 140 is mounted on tripod 144, which includes pan and tilt heads that enable broadcast camera 140 to pan and tilt. Attached to tripod 144 are pan sensor 146 and tilt sensor 148, both of which are connected to pan-tilt electronics 150. Alternatively, camera 140 can include a built-in pan and tilt unit. In either configuration, pan sensor 146, tilt sensor 148 and zoom sensor 152 are considered to be coupled to camera 140 because they can sense data representing the pan, tilt, zoom and focus of broadcast camera 140. Processor 156 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute view-cone data in time for the IB/audio control processor to issue DIB update decisions and soundtrack changes in time for their implementation while the relevant IBs are still being viewed in the associated cameras, as further explained below). Processor 156 also includes memory and a disk drive to store data and software. In addition to being in communication with pan-tilt electronics 150 and analogue to digital converter 154, processor 156 is in communication with an IB/audio control system, which is described below (in relation to Figure 3).
In one embodiment, pan sensor 146 and tilt sensor 148 are optical encoders that output a signal, measured as a number of clicks, indicating the rotation of a shaft. Forty thousand (40,000) clicks represent a full 360° rotation. Thus, a processor can divide the number of measured clicks by 40,000 and multiply by 360 to determine the pan or tilt angle in degrees. The pan and tilt sensors use standard technology known in the art and can be replaced by other suitable pan and tilt sensors known to those skilled in the relevant art. Pan-tilt electronics 150 receives the output of pan sensor 146 and tilt sensor 148, converts the output to a digital signal (representing pan and tilt) and transmits the digital signal to processor 156. The pan, tilt and zoom sensors are used to determine the view-cone of the camera. Thus one or more of the pan, tilt and zoom sensors can together be labelled as a view-cone sensor. For example, if a camera cannot tilt or zoom, the view-cone sensor would only include a pan sensor. The pan, tilt and zoom sensors may be embodied as described above, but other embodiments are possible. Two examples are taught in International patent application number WO98/18261, pp 8-9 starting at line 21, namely (i) marking known locations in the venue such that each mark looks different and one will always be in view to a camera, so that an image thereof is detectable to the processor analysing the image recorded by the camera, or (ii) placing electromagnetic, infra-red or similar emitters around the stadium which are sensed by sensors on the cameras, again enabling computation of relative location and orientation. Where multiple cameras are used, it may be practicable to reduce the number of processors used below the number of cameras, so that, for example, a single processor 156 is linked to multiple cameras and computes the view-cones for all of them.
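The clicks-to-degrees conversion described above is simple arithmetic, and may be sketched as follows (illustrative only; the constant and function names are not taken from the description):

```python
CLICKS_PER_REVOLUTION = 40_000  # encoder resolution stated above

def clicks_to_degrees(clicks):
    """Divide the measured clicks by 40,000 and multiply by 360 to
    obtain the pan or tilt angle in degrees, as described."""
    return clicks / CLICKS_PER_REVOLUTION * 360.0
```

So 10,000 clicks, for instance, corresponds to a quarter turn of the shaft.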
At some venues, cameras may be mobile, for example mounted on straight, delimited tracks parallel to the field of play. In such cases, it will be necessary to track the location of the camera mounting on the track, and to take account of this when computing the location and orientation of the view-cone. Figure 3 is a block diagram of a subset of the components which make up the invention, namely the IB/audio sub-system. Processor 200 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute update decisions and soundtrack changes in time for their implementation while the related IBs are still being viewed in the associated cameras, as further explained below). Processor 200 also includes memory 202, a disk drive 204 to store data and software, a VDU/keyboard/mouse 206, and a removable drive 208 or other means to load pre-created data, e.g. data defining different camera systems or different venue topographies. In more detail, the pre-created data would include the location and orientation in space of all IBs (in order to establish whether or not, at any given time, an IB is in principle within a view-cone), and also the angle at which an IB is no longer considered to be oriented to a camera, and any occlusions present. Certain IBs which are in principle 'in view' may be facing too obliquely to be seen effectively via the camera, or may be occluded. For example, if an IB is perpendicular to the line of view of a camera (and facing that way) it is clearly in view, and may be so when at an angle of 45 degrees to the perpendicular, but not when at an angle of 30 degrees, or when behind a known obstacle. Collectively, this processor and peripherals may be called the IB/audio control system. The processor 200 is connected to the (or each) camera subsystem 198 to obtain view-cone data and possibly other input.
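The test for whether an IB that is in principle within a view-cone is effectively visible might be sketched as below. The 45-degree threshold echoes the example in the text; the parameter names and the simple rule itself are assumptions for illustration:

```python
def ib_effectively_visible(angle_from_perpendicular_deg,
                           max_oblique_deg=45.0, occluded=False):
    """An IB 'in principle' within the view-cone is treated as not
    effectively visible if it faces the camera too obliquely, or if it
    is behind a known obstacle."""
    if occluded:
        # Occlusions are part of the pre-created venue data
        return False
    return angle_from_perpendicular_deg <= max_oblique_deg
```

A fuller implementation would derive the angle and occlusion state from the stored IB positions and orientations and the current view-cone, rather than take them as inputs.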
Where any IBs in use are DIBs, processor 200 is also connected to them 210 by means enabling the processor to instruct each DIB to update its display, and ideally also enabling the DIB to inform the processor whenever its display is updated if it has capability to update other than by instruction from the processor. (Where all IBs in use are SIBs, no connection is necessary or useful.) Processor 200 is also connected to the audio output system 212, and to stored information including the audio output to be overlaid or otherwise included in the soundtrack. Processor 200 may also be connected to other systems, such as position sensing systems for specific objects of significance such as sports balls, or systems such as joysticks or customised consoles for inputting manual change commands to over-ride those generated by the processor. Figure 4 is a flow chart describing the operation of the invention, when simplified to comprise only one camera and one IB. In step 302, pan, tilt and zoom data is sensed, and the view-cone (that is, its size, shape, location and orientation) is determined in step 304 using the data sensed in step 302 together with other stored data including the length of the view-cone, the positions within the venue of the camera, and the rest angle of view of the camera, and communicated to the IB/audio control system. In step 306, processor 200 determines whether the IB is within the view-cone of the camera. In step 307, the processor determines whether the IB is a DIB or a SIB. If the latter, steps 308 to 312 inclusive are skipped. In step 308, if the IB (always being a DIB) is within the view-cone, the processor analyses whether there is any over-riding reason not to issue an instruction to the DIB to update its image. 
The most basic of reasons not to update would be that the processor's record of past updates and timings shows that the DIB has been instructed to update within the last few seconds (where the precise number of seconds is user-determined). Another reason would be if the processor has received information from the DIB that it has self-updated recently, in which case at this step the processor updates its own records of past updates and timings. Another reason might be, if the relevant event were a soccer match and the soccer ball contained a position recording device whose output was available to the IB/audio control system, that the soccer ball was in the view-cone, stationary and located on the penalty spot while the DIB was located behind the goal area, so that an update to the DIB would potentially interfere with the concentration of the player taking the penalty kick. Such object tracking may be accomplished by various technical means, including treating the ball or other object with spectral coatings so that it will reflect (or emit) a distinct signal which can be detected by the camera(s) with a filter or other sensor. In step 310, if the answer to whether to veto was negative, then processor 200 instructs the DIB to update, and also updates its own records of past updates and timings. In step 312, the processor determines whether a soundtrack change is also to be considered (which is user-determined and available from memory 202 or other media 204). If it is, and not otherwise, in step 314, the processor analyses whether there is any over-riding reason not to issue an instruction to the audio output system to deliver audio content. (An example, where images being displayed are advertisements, might be that advertising is deemed more impactful overall if a related soundtrack change is made only sometimes. Another example would be where a soundtrack change recently instructed is still running.)
In step 316, the processor instructs the audio output system to deliver audio output for inclusion in the soundtrack, by overlay or loudspeaker or other technology. Finally, the processor updates its own records of past soundtrack changes and timings.
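The single-camera, single-IB logic of steps 306 to 316 can be sketched as follows. This is a simplified illustration: the parameter names are assumptions, the fixed veto interval stands in for the user-determined interval and for the richer over-riding logic (penalty kicks, self-updates, and so on) described above:

```python
def evaluate_ib(is_dib, sound_enabled, in_view, now,
                last_update, last_sound, min_interval=5.0):
    """One pass of the Figure 4 logic for a single IB. Returns the
    actions to issue and the updated record timestamps."""
    actions = []
    if not in_view:                                   # step 306
        return actions, last_update, last_sound
    if is_dib:                                        # step 307
        # step 308: veto an update issued within the last few seconds
        if now - last_update >= min_interval:
            actions.append("update_display")          # step 310
            last_update = now                         # update records
    if sound_enabled:                                 # step 312
        # step 314: veto, e.g. a recent soundtrack change still running
        if now - last_sound >= min_interval:
            actions.append("play_audio")              # step 316
            last_sound = now                          # update records
    return actions, last_update, last_sound
```

An SIB with soundtrack changes enabled simply skips the display branch, as the flowchart's step 307 prescribes.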
Variations and enhancements to the first embodiment

Figure 5 is a flow chart describing the operation of the invention, when applied to a system comprising multiple cameras and multiple IBs. In step 402, pan, tilt and zoom data is sensed for each camera, and the view-cone of each is determined in step 404, and communicated to the IB/audio control system. In step 405, processor 200 determines which IBs are within the view-cones of which cameras, and stores this data as an array, and checks that the array is not empty in step 406. If there are no IBs within any view-cones, then the processing ends, and reiterates from step 402. In step 407, the processor analyses the array to establish an order of priority in which to evaluate those IBs (evaluation meaning, for any IBs which are DIBs, deciding whether to instruct an update, and, for any IBs for which soundtrack changes are enabled, whether to instruct such changes). The analysis is performed based on stored rules and data. The order of priority could be simply arbitrary, but the aim of the stored rules and data would be to bring forward in time the consideration of updates to those DIBs, and the consideration of soundtrack changes relating to those IBs, most likely to feature prominently in any final television programme, and to decide an order for consideration where two or more updates (perhaps by virtue of relating to closely positioned IBs) are held equally likely to feature equally prominently. This analysis may involve assigning values to the different cameras, possibly based on real-time data input from the audiovisual editing studio as to which cameras are 'live', and possibly on stored heuristic rules as further described below. The analysis can, if desired, also take account of DIB update history, and of soundtrack change history. Once an order of priority is established, step 407 ends. The list produced could be of any length, and could comprise a single IB.
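The prioritisation of step 407 might be sketched as below. The weighting scheme is an illustrative stand-in for the stored rules and data; the names `camera_relevance` and `ib_value` are assumptions of this sketch:

```python
def priority_order(sightings, camera_relevance, ib_value=None):
    """Order the (camera, IB) pairs recorded at step 405 for evaluation.
    camera_relevance maps camera id to a weight (e.g. highest for the
    camera currently 'live'); ib_value optionally weights the IBs
    themselves. Highest combined weight is evaluated first."""
    ib_value = ib_value or {}
    return sorted(
        sightings,
        key=lambda pair: (camera_relevance.get(pair[0], 0),
                          ib_value.get(pair[1], 0)),
        reverse=True)
```

A real system would derive the camera weights from the editing-studio 'live' signal or from the heuristic rules discussed below, and could further adjust the order using DIB update and soundtrack change history.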
In step 408, the processor repeats, for each IB in the order of priority, steps 307 to 316 inclusive taught in the flowchart in Figure 4. In one embodiment, the processor limits the number of change commands issued, and therefore may suppress further processing part way through the order of priority established in step 407. In another embodiment, the processor interrupts itself during list processing to commence processing of a further priority list prepared later in time. Where multiple cameras are in use, it becomes desirable for processor 200 to have an information signal as to which cameras are currently 'relevant', meaning that they are recording events either which are being viewed live, or which will be viewed later, or which may be viewed later. In certain circumstances (such as events broadcast live, via a set of cameras under the unitary control of one editor, employing systems to switch between cameras which can output electronic data to the computer), it may be possible and practicable to supply the computer with precise data about which camera is 'relevant' at every moment throughout an event. Otherwise, it will be practicable to create heuristic rules for use by the computer in assessing how to prioritise cameras. These rules in a simple form might assign higher relevance to cameras in specific (e.g. nearer, more central, etc.) locations, and/or to cameras which are moving (panning and tilting) at certain rates or in certain patterns. For example, if the event is a soccer match, certain events like corner kicks would generate camera movements which follow a discernible pattern (that is, several cameras are demonstrably tracking the same object, and that object is travelling in from the corner toward the goal-mouth). In such cases, it is possible to identify one camera as the most commonly televised, and assign higher priority to that camera.
It might also be the case that such patterns indicate that a moderately high relevance should be assigned to numerous other cameras, because the recordings of those other cameras will be shown as well, as action replays. In more complex form, image analysis or other technology might be used to determine whether or not the camera is viewing a specific object (such as a soccer ball) and if so from how far away. Although it is desirable in multi-camera systems to have good information about the relative (and absolute) priorities of the cameras over time, it is - in most configurations as to the number of cameras etc - not fatal to the usefulness of the invention, because a positive effect on impactfulness of images will still occur even if the computer cannot determine precisely which footage will eventually be seen by the audience. DIB changes will occur, and while many will be redundant, many others will be seen. In such cases, it might be appropriate to vary the other instructions given to the computer so as to raise in general the number of 'change' instructions issued to DIBs. The same analysis applies where the invention is used to make soundtrack changes, although it may be important in such cases that each soundtrack change is effected in the audio material specific to the video footage from the camera viewing the relevant IB, as soundtrack changes made to a general soundtrack (to be accompanied by a video track assembled later from footage available from various cameras) cannot be made to relate to viewing of the relevant IBs. 
It is feasible for the computer to take note of the speed with which a camera is moving (panning and tilting) and to inhibit IB update and/or soundtrack change instructions relating to IBs in view to that camera, if it is predictable that the IB will be out of view (or partly out of view, or about to be out of view) by the time the instruction to change is actioned, or if the relevant images require more time for digestion by the audience. It is feasible, when using this invention, to operate new forms of advertising, in which the message to be communicated to the audience relies on the fact that IB updates and/or soundtrack changes can be linked to events or otherwise controlled. Advertising is a highly creative field, but examples of how this might occur include: (i) At a soccer event, there are DIBs around the ground. One sponsor of a soccer team reserves all image spaces ("panels") on certain DIBs near the goal areas, on all save one of which appears a neutral slogan, and on the last panel is a message thought suitable for display when the sponsored team is moving forward in attack. (During each half, the 'attack' message near the team's own goal area is unused.) When the team surges forward, the audience is shown the 'attack' message, which is therefore additionally impactful because it is synchronised with suitably reinforcing live events. (An example of an 'attack' message might be a slogan advertising the accuracy afforded to aiming by the team's footwear sponsor's boots.) (ii) Where two IBs are frequently in view together (to the same camera) or are frequently or invariably seen in a certain sequence, their changes may be co-ordinated (i.e. synchronised, by which we mean in time with each other or with a controlled lag between them) to relay a particular message.
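The inhibition based on camera movement speed, described at the start of the preceding paragraph, amounts to a simple prediction. A minimal sketch (with assumed parameter names and a linear prediction that is an illustration only):

```python
def should_inhibit_update(pan_rate_deg_per_s, angle_to_cone_edge_deg,
                          actuation_delay_s):
    """Inhibit a change instruction if the camera's pan rate predicts
    that the IB will have swept out of the view-cone (or be about to)
    by the time the instruction is actioned."""
    predicted_sweep = pan_rate_deg_per_s * actuation_delay_s
    return predicted_sweep >= angle_to_cone_edge_deg
```

A fuller rule would also account for tilt, for the IB's extent within the cone, and for the digestion-time consideration mentioned above.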
When the invention is used, it will become possible to introduce new methods of pricing or quantifying advertising which more accurately reflect the benefits given or received. For example, it will be possible to count 'mass-audience visibility' or even 'mass-audience displays', and sell these instead of seconds of 'in principle' visibility. Advertisements sharing one display device would no longer be displayed for fixed periods. This invention involves information processing relating both to camera/audio-recording systems (which are typically owned and operated by television and broadcast entities) and to IB systems (which are typically owned and operated by sports clubs or venue operators). Given that much of the specific pre-programmed information required relates to the physical characteristics of the venue and the kinds of event which take place there, it would seem prima facie simpler for venue operators to control the computing functions, and receive feeds from cameras recording events on specific occasions, and this is the approach taken in this description, i.e. where multiple cameras and IBs exist, the cameras all singly feed data to a unitary IB/audio control system. There is no need for the invention to be exploited this way, and it could be operated either independently or with the central control unit perceived as an adjunct to the camera and recording technologies, from which (where the IBs are DIBs) messages are sent as required to individual DIBs. It is feasible that the computer-automated functions of determining when to make IB updates and/or soundtrack changes could be over-ridden manually for particular reasons. Thus a person watching the image from one or more cameras could intervene and control updates/changes directly.
For example, where a DIB change behind a goal in a soccer game is deferred while a penalty kick is taken, and then permitted, it would be advantageous (as a further example of inventive advertising) to select between two or more alternative messages to be displayed in such situations. While in principle the determination as to whether the penalty resulted in a goal or not could be automated (based on noise levels, image analysis, data-feed from the referee, etc.), it might be simpler to allow an individual to make an instant decision and activate a change to the DIBs behind the goal which differs according to the penalty outcome. In an international match where an advertisement or other image is primarily aimed at one nation, that image will perhaps be more impactful if linked to a successful event (score or save) than the reverse. Analogously, human intervention might be appropriate where the soundtracks to be played include the catchphrases of a cartoon or other character being used in a merchandising role, to match the catchphrases appropriately to events. It is usual but not essential for each individual image to be a still image. It is quite feasible that an individual element of advertising or other visual material could involve motion; for example, it could be a short sequence of video footage (displayed on an electronic display device, or by projection on a screen), or a sequence of two or more still images, intended to be displayed in order. It is feasible to use the invention with mobile IBs. In such a case, the processor 200 needs to receive information about the position and orientation of IBs as this data changes. It is usual but not essential that the same IBs are visible to the physically present audience and to the television audience. It would be feasible in some circumstances to position IBs so that they are capable of being seen only by the cameras.
Note that, whilst the above description refers in places to advertising media, the invention may be applied to other types of media, such as information display boards, laser projection apparatus, public address systems, etc. Further features and advantages of the invention will become apparent from the following description of preferred embodiments of the invention, given by way of example only, which is made with reference to the accompanying drawings.

Claims
1. A method of controlling a display system at a live event, the system comprising a dynamic display apparatus capable of altering a display from the display of a first message to a second, independent, message, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the dynamic display apparatus in response to the monitoring to alter the display from the first message to the second message.
2. A method according to claim 1, wherein the dynamic display apparatus comprises at least two different printed images, which are included in said first and second messages, and which are selectively displayable to alter the display from the first message to the second message.
3. A method according to claim 2, wherein the dynamic display apparatus includes a mechanical display in which said printed images are printed on a loop of material which is scrolled in response to the monitoring.
4. A method according to claim 2, wherein the dynamic display apparatus includes a mechanical display in which said printed images are printed on triangular prisms which are rotated in response to the monitoring.
5. A method according to claim 1, wherein the dynamic display apparatus comprises an electronic display device capable of displaying at least two different images, which are included in said first and second messages, and which are displayed in dependence upon an image signal input into the device.
6. A method according to any preceding claim, wherein the display apparatus comprises first and second dynamic display devices, and the method comprises controlling the first and second dynamic display devices so as to synchronise changes in displays provided by the respective devices.
7. A method according to any preceding claim, wherein said signals indicative of characteristics of a field of view of a camera are generated by one or more detectors connected to the camera.
8. A method according to claim 7, wherein said detectors include one or more detectors mechanically connected to the camera.
9. A method according to claim 7, wherein said detectors include one or more detectors electrically connected to the camera.
10. A method according to any of claims 7 to 9, wherein said detectors include a zoom sensor.
11. A method according to any of claims 7 to 10, wherein said detectors include a tilt sensor.
12. A method according to any of claims 7 to 11, wherein said detectors include a pan sensor.
13. A method according to any preceding claim, wherein the system comprises an audio signal generator, and the method comprises controlling the audio signal generator so as to synchronise changes in an audio signal provided by the audio signal generator with changes in a display provided by the dynamic display apparatus.
14. A method according to claim 13, wherein the method comprises recording an audio signal at the live event and combining the audio signal output by the audio signal generator with the recorded audio signal.
15. A method according to claim 14, wherein the method comprises outputting the audio signal using a loudspeaker at the live event.
16. Display apparatus for use at a live event, the apparatus comprising a dynamic display apparatus capable of altering a display from the display of a first message to a second, independent, message, the apparatus being arranged to monitor signals indicative of characteristics of a field of view of a camera at the live event, and to control the dynamic display apparatus in response to the monitoring to alter the display from the first message to the second message.
17. A method of controlling an audio generating system at a live event, the system comprising an audio signal generator, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the audio signal generator in response to the monitoring.
18. A method according to claim 17, wherein the system comprises a dynamic display apparatus, the method comprising controlling the audio signal generator so as to synchronise changes in an audio signal provided by the audio signal generator with changes in a display provided by the dynamic display apparatus.
19. A method according to claim 18, the method comprising controlling the dynamic display apparatus in response to the monitoring.
20. A method according to claim 19, wherein the dynamic display apparatus is capable of altering a display from the display of a first message to a second, independent, message, the method comprising controlling the dynamic display apparatus in response to the monitoring to alter the display from the first message to the second message.
21. A method according to any of claims 17 to 20, wherein the monitored signals indicative of characteristics of a field of view of a camera at the live event are indicative of whether an advertising image is in the field of view of the camera.
22. A method according to claim 21, wherein the advertising image is a static image.
23. A method according to any of claims 17 to 22, wherein the method comprises recording an audio signal at the live event and combining the audio signal output by the audio signal generator with the recorded audio signal.
24. A method according to any of claims 17 to 22, wherein the method comprises outputting the audio signal using a loudspeaker at the live event.
25. An audio generating system for use at a live event, the system comprising an audio signal generator, the system being arranged to monitor signals indicative of characteristics of a field of view of a camera at the live event, and to control the audio signal generator in response to the monitoring.
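By way of illustration only, and not as part of the filed claims, the control scheme the claims describe in outline (monitor pan/tilt/zoom sensor signals, decide whether a static advertising board is in the camera's field of view, switch the dynamic display between two independent messages, and change an audio cue in step with the display) could be sketched as follows. All names, the angular bearings, and the simple angular-overlap visibility test are assumptions introduced for illustration:

```python
from dataclasses import dataclass

@dataclass
class CameraState:
    """Illustrative sensor readout: pan and tilt angles (degrees) and a
    horizontal field-of-view angle derived from the zoom sensor (degrees,
    narrower when zoomed in)."""
    pan: float
    tilt: float
    fov: float

def board_in_view(state: CameraState, board_pan: float, board_tilt: float) -> bool:
    """Return True if a (static) advertising board, located at the given
    pan/tilt bearing from the camera, falls inside the field of view."""
    half = state.fov / 2.0
    return (abs(state.pan - board_pan) <= half and
            abs(state.tilt - board_tilt) <= half)

class DisplayAndAudioController:
    """Switches a dynamic display between a first and a second, independent,
    message and keeps an audio cue synchronised with each change."""
    def __init__(self) -> None:
        self.current_message = "first"
        self.audio_cue = None

    def update(self, state: CameraState, board_pan: float, board_tilt: float) -> str:
        wanted = "second" if board_in_view(state, board_pan, board_tilt) else "first"
        if wanted != self.current_message:
            self.current_message = wanted        # alter the display
            self.audio_cue = f"cue-{wanted}"     # synchronised audio change
        return self.current_message
```

In use, one update per sensor sample suffices: while the board is framed by the camera the second message (and its audio cue) is active, and the display reverts to the first message once the camera pans or zooms away.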
PCT/GB2004/004782 2003-11-12 2004-11-12 Methods of, and systems for, controlling a display apparatus and related audio output WO2005048225A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0326391.0 2003-11-12
GB0326391A GB2408164A (en) 2003-11-12 2003-11-12 Controlling a dynamic display apparatus

Publications (1)

Publication Number Publication Date
WO2005048225A1 true WO2005048225A1 (en) 2005-05-26

Family

ID=29726414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2004/004782 WO2005048225A1 (en) 2003-11-12 2004-11-12 Methods of, and systems for, controlling a display apparatus and related audio output

Country Status (2)

Country Link
GB (1) GB2408164A (en)
WO (1) WO2005048225A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9373123B2 (en) 2009-12-30 2016-06-21 Iheartmedia Management Services, Inc. Wearable advertising ratings methods and systems
US9047256B2 (en) 2009-12-30 2015-06-02 Iheartmedia Management Services, Inc. System and method for monitoring audience in response to signage
FR2959339A1 (en) * 2010-04-26 2011-10-28 Citiled METHOD FOR CONTROLLING AT LEAST ONE DISPLAY PANEL OF VARIABLE IMAGES IN A PLACE SUCH AS A STAGE
DE102016119639A1 (en) 2016-10-14 2018-04-19 Uniqfeed Ag System for dynamic contrast maximization between foreground and background in images or / and image sequences
DE102016119637A1 (en) 2016-10-14 2018-04-19 Uniqfeed Ag Television transmission system for generating enriched images
DE102016119640A1 (en) 2016-10-14 2018-04-19 Uniqfeed Ag System for generating enriched images

Citations (9)

Publication number Priority date Publication date Assignee Title
US4806924A (en) * 1984-06-29 1989-02-21 Daniel Giraud Method and system for displaying information
US5003293A (en) * 1989-10-02 1991-03-26 Compunic Electronics Co., Ltd. Billboard with audio message spreading function
JPH07312712A (en) * 1994-05-19 1995-11-28 Sanyo Electric Co Ltd Video camera and reproducing device
US5510828A (en) * 1994-03-01 1996-04-23 Lutterbach; R. Steven Interactive video display system
JPH08171413A (en) * 1994-12-16 1996-07-02 Fuji Facom Corp Monitor controller using video and audio
FR2730837A1 (en) * 1995-02-22 1996-08-23 Sciamma Dominique Real=time advertising insertion system for television signal
US5562459A (en) * 1994-01-07 1996-10-08 Durlach; David M. Dynamic three dimenional amusement and display device
WO1998024242A1 (en) * 1996-11-27 1998-06-04 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
KR20010074218A (en) * 2001-04-11 2001-08-04 김도균 A Sign Board to say with Voice storeage system

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US4814800A (en) * 1988-03-16 1989-03-21 Joshua F. Lavinsky Light show projector
JPH04129062A (en) * 1990-09-18 1992-04-30 Fujitsu Ltd Video and audio simultaneous output device
US5903317A (en) * 1993-02-14 1999-05-11 Orad Hi-Tech Systems Ltd. Apparatus and method for detecting, identifying and incorporating advertisements in a video
US5461596A (en) * 1993-10-26 1995-10-24 Eastman Kodak Company Portfolio photo CD visual/audio display system
JPH07199879A (en) * 1993-12-28 1995-08-04 Hitachi Eng Co Ltd Device and method for sound corresponded image display
IL108957A (en) * 1994-03-14 1998-09-24 Scidel Technologies Ltd System for implanting an image into a video stream
AU7159796A (en) * 1996-03-13 1997-10-01 Howard W. Regen Improved audio-visual sign
JP2000341642A (en) * 1999-05-25 2000-12-08 Hitachi Ltd Synchronization processor for digital picture and sound data

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US4806924A (en) * 1984-06-29 1989-02-21 Daniel Giraud Method and system for displaying information
US5003293A (en) * 1989-10-02 1991-03-26 Compunic Electronics Co., Ltd. Billboard with audio message spreading function
US5562459A (en) * 1994-01-07 1996-10-08 Durlach; David M. Dynamic three dimenional amusement and display device
US5510828A (en) * 1994-03-01 1996-04-23 Lutterbach; R. Steven Interactive video display system
JPH07312712A (en) * 1994-05-19 1995-11-28 Sanyo Electric Co Ltd Video camera and reproducing device
JPH08171413A (en) * 1994-12-16 1996-07-02 Fuji Facom Corp Monitor controller using video and audio
FR2730837A1 (en) * 1995-02-22 1996-08-23 Sciamma Dominique Real=time advertising insertion system for television signal
WO1998024242A1 (en) * 1996-11-27 1998-06-04 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
KR20010074218A (en) * 2001-04-11 2001-08-04 김도균 A Sign Board to say with Voice storeage system

Non-Patent Citations (2)

Title
PATENT ABSTRACTS OF JAPAN vol. 1996, no. 03 29 March 1996 (1996-03-29) *
PATENT ABSTRACTS OF JAPAN vol. 1996, no. 11 29 November 1996 (1996-11-29) *

Also Published As

Publication number Publication date
GB2408164A (en) 2005-05-18
GB0326391D0 (en) 2003-12-17

Similar Documents

Publication Publication Date Title
TWI701628B (en) Display control system and display control method for live broadcast
US10171754B2 (en) Overlay non-video content on a mobile device
US11087135B2 (en) Virtual trading card and augmented reality movie system
US9143699B2 (en) Overlay non-video content on a mobile device
JP5908535B2 (en) Supplemental video content displayed on mobile devices
JP4230999B2 (en) Video-operated interactive environment
US5917553A (en) Method and apparatus for enhancing the broadcast of a live event
CN108886583B (en) System and method for providing virtual pan-tilt-zoom, PTZ, video functionality to multiple users over a data network
US9386339B2 (en) Tagging product information
US6269173B1 (en) Instant response broadcast board system and method
US7868914B2 (en) Video event statistic tracking system
US20150189243A1 (en) Automated video production system
CN101324945A (en) Advertisement selection method and system for determining time quantity of player for consumer to view advertisement
JP2006505330A5 (en)
CA2980501C (en) Method and system for presenting game-related information
JP4934094B2 (en) Sports video transmission device
US8587667B2 (en) Beyond field-of-view tracked object positional indicators for television event directors and camera operators
WO2005048225A1 (en) Methods of, and systems for, controlling a display apparatus and related audio output
US20110141359A1 (en) In-Program Trigger of Video Content
WO2005076598A1 (en) An intelligent method and an intelligent system for integrated tv messaging or advertising
US20230353717A1 (en) Image processing system, image processing method, and storage medium
EP2923485B1 (en) Automated filming process for sport events
KR102195053B1 (en) Television broadcast system for generating augmented images
JP2007049661A (en) Advertisement apparatus and method employing retrieval technique in moving image distribution service
WO2001082195A1 (en) Systems and methods for integrating virtual advertisements into recreated events

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC. FORM 1205A DATED 16.10.2006

122 Ep: pct application non-entry in european phase