WO2008145952A1 - Apparatus and method of body characterisation - Google Patents

Apparatus and method of body characterisation

Info

Publication number
WO2008145952A1
WO2008145952A1 PCT/GB2008/001464
Authority
WO
WIPO (PCT)
Prior art keywords
user
image
computer
light
goods
Prior art date
Application number
PCT/GB2008/001464
Other languages
French (fr)
Inventor
Mandana Zarrabi Jenabzadeh
Original Assignee
Sony Computer Entertainment Europe Limited
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Limited
Publication of WO2008145952A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1077 Measuring of profiles
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41H APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
    • A41H1/00 Measuring aids or methods
    • A41H1/02 Devices for taking measurements on the human body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062 Arrangements for scanning
    • A61B5/0064 Body surface scanning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072 Measuring physical dimensions, e.g. size of the entire body or parts thereof measuring distances on the body, e.g. measuring length, height or thickness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 Projection by scanning of the object
    • G01B11/2527 Projection by scanning of the object with phase change by in-plane movement of the pattern
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7232 Signal processing specially adapted for physiological signals or for diagnostic purposes involving compression of the physiological signal, e.g. to extend the signal recording period
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/744 Displaying an avatar, e.g. an animated cartoon character

Definitions

  • the present invention relates to an apparatus and method of body characterisation.
  • Methods of scanning the human body have proliferated in recent years, moving from X-Ray and magnetic resonance imaging (MRI) to so-called 'T-Ray' (terahertz frequency) and low-power microwave scanners.
  • the high-energy imaging systems such as X-Ray and MRI are intended to penetrate the body and reveal internal structures, primarily for medical diagnosis.
  • the mid-energy imaging systems such as T-Ray and microwave imagers, can penetrate clothing but not skin, and are being introduced for applications such as airport security, in order to reveal concealed weapons and the like (see http://optics.org/cws/article/research/22714).
  • a body characterising system comprises a computing device; an optical imaging device to capture one or more images; and a scale estimator comprising an optical distance measuring device; the computing device comprising an image processor to identify the image of the user within the background of the one or more captured images; and in which the computing device is operable to estimate the scale of the image of the user from the scale estimator, and to generate data descriptive of the user's body dependent upon the identified image of the user and the estimated scale of that image.
  • a system for selecting goods and/or services according to body characteristics comprises: a server providing access to an online retailer; and a body characterising system according to the first aspect, the server and the body measurement system each comprising respective communication means for communication over a network connection; and in which the server is operable to select from among a plurality of options for goods and/or services offered by the online retailer according to some or all of the data descriptive of a user's body transmitted over the network connection to the server by the body measurement system communication means.
  • the body characterising system employs comparatively cheap and compact optical imaging means to measure the user, making automated characterisations of the user's body affordable and practical at home.
  • the system can use optical imaging because it exploits the fact that in their own home a user can be measured naked or in their underwear, and therefore high-energy scanners that can penetrate clothing are not required.
  • these characterisations of the user's body can be used to select, for example, the appropriate sized garment for shipping to the user during an on-line purchase.
  • Figure 1 is a schematic diagram of an entertainment device
  • Figure 2 is a schematic diagram of a cell processor
  • Figure 3 is a schematic diagram of a video graphics processor
  • Figure 4 is a schematic diagram of a user interacting with a body characterisation system in accordance with an embodiment of the present invention.
  • Figure 5A is an optical measurement device in accordance with an embodiment of the present invention.
  • Figure 5B is an optical measurement device in accordance with an embodiment of the present invention.
  • Figure 6 is a schematic diagram of various measurements on a human body in accordance with an embodiment of the present invention.
  • Figure 7 is a schematic diagram of a distributed body characterisation system in accordance with an embodiment of the present invention.
  • Figure 8 is a flow diagram of a method of body characterisation in accordance with an embodiment of the present invention.
  • an entertainment device such as a personal computer or a Sony ® Playstation 3 ® entertainment machine is connected to a webcam.
  • the entertainment device obtains basic information about the user such as their height, either by requesting the data to be entered manually or by measuring the user's height and/or distance from the camera directly using infra-red or ultrasound sensors.
  • the entertainment device provides visual feedback to the user enabling them to adopt one or more poses for image capture. From these poses and the height and/or distance of the user from the camera the entertainment device can determine the per-pixel scale of the user image received and thus the user's physical dimensions. From these, the entertainment device can derive the user's vital statistics.
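The per-pixel scale conversion described above can be sketched as follows. This is an illustrative reconstruction only; the function names and example figures are assumptions, not taken from the patent:

```python
# Illustrative per-pixel scale estimate: once the user's real height and their
# height in pixels are both known, any other pixel span converts to metres.

def per_pixel_scale(user_height_m, user_height_px):
    """Metres represented by one image pixel at the user's distance from the camera."""
    return user_height_m / user_height_px

def pixels_to_metres(length_px, scale_m_per_px):
    """Convert a measured pixel span in the user image to metres."""
    return length_px * scale_m_per_px

# Example: a 1.75 m user spans 700 pixels in the identified image.
scale = per_pixel_scale(1.75, 700)               # 0.0025 m per pixel
shoulder_width_m = pixels_to_metres(180, scale)  # a 180 px span is 0.45 m
```

From measurements converted this way the device can derive dimensions such as those illustrated in Figure 6.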
  • the device is able to make use of simple and cheap optical frequency imaging technology such as a webcam since, in the privacy of their own home, a person can measure themselves naked or in their underwear, and thus provide an accurate optical measurement of body shape without the need for high-energy scanners such as those found in the prior art.
  • the results can then be stored by the entertainment device until the user chooses to update or discard them, and can be provided by the entertainment device to accredited on-line retailers to enable accurate selection of garment sizes for purchase by the user, thereby improving user satisfaction and saving significant costs on returned ill-fitting goods.
  • the entertainment device is a Sony ® Playstation 3 ® (PS3 ® ).
  • Figure 1 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device; a system unit 10 is provided, with various peripheral devices connectable to the system unit.
  • the system unit 10 comprises: a Cell processor 100; a Rambus® dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.
  • the system unit 10 also comprises a Blu-Ray® Disk BD-ROM® optical disk reader 430 and a hard disk drive (HDD) 400, accessible through the I/O bridge 700.
  • the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700.
  • the I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections.
  • the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
  • the wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; and a microphone headset 757.
  • Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
  • The Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
  • a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation® or Playstation 2® devices.
  • the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link.
  • the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751.
  • the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands.
  • other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller.
  • additional game or control information may be provided on the screen of the device.
  • Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
  • the remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link.
  • the remote control 752 comprises controls suitable for the operation of the Blu-Ray Disk BD-ROM reader 430 and for the navigation of disk content.
  • the Blu-Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs.
  • the reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
  • the reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
  • the system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310.
  • the audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
  • Audio processing (generation, decoding and so on) is performed by the Cell processor 100.
  • the Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
  • the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10.
  • the camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions.
  • Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port.
  • Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data.
  • the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
  • In order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided.
  • Device driver technology is well known and will not be described in detail here, except to say that the skilled person will be aware that a device driver or similar software interface may be required in the embodiment described.
  • the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components, referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS.
  • the Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache.
  • the PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz.
  • the primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
  • Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown).
  • Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB.
  • Each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
  • An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation.
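The per-SPE figure of 25.6 GFLOPS quoted above can be reconciled with the 4-wide single-precision SIMD description: four lanes at 3.2 GHz alone give only 12.8 GFLOPS, so the quoted figure assumes a fused multiply-add counting as two floating point operations per lane per cycle (that assumption is an inference, not stated in the text):

```python
# Cross-checking the quoted per-SPE throughput from the SIMD description.
simd_lanes = 4              # four 32-bit single precision values per cycle
ops_per_lane_per_cycle = 2  # multiply + add via fused multiply-add (assumed)
clock_hz = 3.2e9            # SPU clock quoted in the text

gflops = simd_lanes * ops_per_lane_per_cycle * clock_hz / 1e9
print(gflops)  # 25.6, matching the per-SPE figure quoted above
```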
  • the SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
  • the Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences.
  • the EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
  • the theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 B per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
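The EIB figures in the two bullets above are mutually consistent, as the following arithmetic check shows (pure arithmetic on the numbers given in the text):

```python
# Sanity check of the Element Interconnect Bus figures quoted above.
participants = 12             # PPE + memory controller + dual bus interface + 8 SPEs
bytes_per_clock_each = 8      # read/write rate per participant
clock_hz = 3.2e9

# On a ring with channels running in both directions, the longest path
# between two participants is half-way round in the shorter direction.
max_hops = participants // 2  # 6 steps, as stated

peak_bytes_per_clock = participants * bytes_per_clock_each  # "96B per clock"
peak_gb_per_s = peak_bytes_per_clock * clock_hz / 1e9       # 307.2 GB/s
```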
  • the memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated.
  • the memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
  • the dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B.
  • the interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Synthesiser graphics unit 200 via controller 170B.
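The FlexIO totals above imply a per-channel rate of 5.2 GB/s, a figure not stated in the text but recoverable from the quoted totals:

```python
# Reconstructing the FlexIO bandwidth totals from the channel counts above.
outbound_channels = 7
inbound_channels = 5
gb_per_s_per_channel = 5.2   # inferred: 36.4 / 7 = 26.0 / 5 = 5.2

outbound_gb_s = outbound_channels * gb_per_s_per_channel  # 36.4 GB/s
inbound_gb_s = inbound_channels * gb_per_s_per_channel    # 26.0 GB/s
total_gb_s = outbound_gb_s + inbound_gb_s                 # 62.4 GB/s
```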
  • Data sent by the Cell processor 100 to the Reality Synthesiser graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
  • the Reality Synthesiser graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100.
  • the RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output.
  • the RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s.
  • the VRAM 250 maintains a frame buffer 214 and a texture buffer 216.
  • the texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines.
  • the RSX can also access the main memory 500 via the EIB- 180, for example to load textures into the VRAM 250.
  • the vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
  • the pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel.
  • Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
  • the render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image.
  • the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency.
  • the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
  • Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second.
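The two shader-throughput figures quoted above are consistent with each other: dividing the total rate by the per-clock count recovers the RSX core clock of 550 MHz (a figure not stated in the text, but implied by it):

```python
# Checking that the quoted shader figures agree with each other.
shader_ops_per_clock = 136        # quoted per-clock shader operation count
total_shader_ops_per_s = 74.8e9   # quoted combined pipeline rate

implied_clock_hz = total_shader_ops_per_s / shader_ops_per_clock
print(implied_clock_hz)  # 550,000,000 Hz, i.e. a 550 MHz core clock
```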
  • the total floating point performance of the RSX 200 is 1.8 TFLOPS.
  • the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene.
  • the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles.
  • the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B.
  • the or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B.
  • the assigned SPEs become part of the video processing pipeline for the duration of the task.
  • the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled.
  • the disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process.
  • the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
  • the PPU 155 can assign tasks to SPEs in several ways.
  • SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE.
  • two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
  • Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
  • the software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS).
  • the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video.
  • the interface takes the form of a so-called cross media bar (XMB), with categories of function arranged horizontally.
  • the user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion.
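The navigation model described above (horizontal categories, each with a vertically scrollable list of options centred on the highlighted category) can be sketched as a small state machine. This is a hypothetical illustration, not the PS3 system software; the category and option names are invented:

```python
# Minimal sketch of cross media bar navigation: horizontal moves change
# category, vertical moves change the option within the current category.
xmb = {
    "Game": ["Play", "Saved Data"],
    "Music": ["Play CD", "Playlists"],
    "Photo": ["Slideshow"],
    "Video": ["Play"],
}
categories = list(xmb)

class XmbCursor:
    def __init__(self):
        self.cat = 0   # horizontal position (highlighted category)
        self.opt = 0   # vertical position (highlighted option)

    def move_horizontal(self, delta):
        self.cat = (self.cat + delta) % len(categories)
        self.opt = 0   # option list recentres on the new category

    def move_vertical(self, delta):
        options = xmb[categories[self.cat]]
        self.opt = (self.opt + delta) % len(options)

    def highlighted(self):
        name = categories[self.cat]
        return name, xmb[name][self.opt]

cursor = XmbCursor()
cursor.move_horizontal(1)  # highlight the "Music" category
cursor.move_vertical(1)    # highlight the "Playlists" option
```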
  • the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
  • the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available.
  • the on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term "on-line” does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
  • the entertainment device 10 is connected to the display device 300 (e.g. a television), and to a so-called 'z-cam' 1100 such as that provided by PrimeSense ® (see http://www.primesense.com/), in place of the video camera 756.
  • the z-cam 1100 comprises a charge coupled device (CCD) 1001 or other imaging means, together with a light emitter 1060 operable to emit a pattern of light, such as a grid of lines 1070.
  • This emitter may be a low-power laser able to rapidly 'draw' the pattern by steering of the laser's output beam (by known refractive and/or reflective elements), or a light source coupled with a mask 1080 upon which the pattern is placed.
  • the emitter operates in the near-infrared, which is invisible to the user but can be seen by the CCD and thus captured for analysis.
  • the distortion and scaling of the pattern can be used to determine the distance between the emitter and the point of reflection, thereby generating an array of distance or depth values in the notional z-axis of the captured image.
  • the resolution of the illuminated pattern will be limited by the quality of the optics and by effects such as diffraction that blur the pattern and so limit how closely lines in the pattern can be spaced.
  • Because the resolution of the depth measurement is dependent upon the resolution of the pattern, optionally several patterns can be used in quick succession (for example a grid pattern followed by the same grid pattern offset half a square vertically and horizontally) and combined to give, in this example, double the depth resolution possible from either pattern on its own. This may be achieved by alternately illuminating two physically offset masks in quick succession and in synchronisation with respective successive capture cycles of the CCD.
  • the plurality of patterns may be used simultaneously.
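The relationship between pattern distortion and distance described above can be illustrated with a minimal structured-light triangulation sketch. This is an illustrative model only; the patent does not specify the geometry, and the focal length and baseline figures below are assumed example values.

```python
def depth_from_displacement(focal_px: float, baseline_m: float, displacement_px: float) -> float:
    """Triangulation: a pattern feature whose image position is displaced
    `displacement_px` pixels from its infinite-distance reference position
    lies at depth f * b / d (pinhole model, emitter-camera baseline b)."""
    if displacement_px <= 0:
        raise ValueError("displacement must be positive")
    return focal_px * baseline_m / displacement_px

# Assumed example values: 500 px focal length, 10 cm emitter-camera baseline.
# A grid line shifted by 25 px then corresponds to a depth of 2 m.
print(depth_from_displacement(500.0, 0.10, 25.0))
```

Applying this per grid line (or per grid intersection) yields the array of depth values in the notional z-axis of the captured image.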
  • the z-cam 1100 comprises a CCD 1001 or other imaging means, together with a remote distance measuring means such as an infra-red or ultrasound source and receiver, to measure reflection path times and so determine distances.
  • the distance measuring means is arranged to form an array of distance measurements that can be correlated with the captured image, thereby augmenting the captured image with depth information (i.e. on the z-axis) with respect to the camera.
  • the array of optical measurements can be achieved by, for example, use of one or more electrically-moveable micro-mirrors 1130 to steer the infra-red signal in a manner similar to the use of micro mirrors in digital light projector (DLP) televisions and projectors.
  • the infra-red signal will be from a solid-state laser 1120.
  • a modulator 1110 provides modulation of the light signal which, coupled with the steering of the micro-mirrors allows correlation with the reflected signals to reconstruct the reflected path distance as the mirror or mirrors conduct a sweep of the observed area.
  • the reflected signal is received by a detector 1140 and correlated with the original signal by an auto-correlator 1150 to determine the path delay.
  • a steered emitter array forming a narrow acoustic beam can be used with similar signal modulation and correlation to that described above.
  • visible light may also be used instead of or in addition to infra-red.
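The modulation-and-correlation scheme outlined above can be sketched as follows (a simplified sampled-signal model, not the actual circuit of the modulator 1110 and auto-correlator 1150): the lag at which the cross-correlation of the emitted and received signals peaks gives the round-trip delay, and hence the reflected path distance.

```python
import numpy as np

def path_delay_samples(emitted: np.ndarray, received: np.ndarray) -> int:
    """Return the round-trip delay, in samples, as the lag that maximises
    the cross-correlation between the emitted and received signals."""
    corr = np.correlate(received, emitted, mode="full")
    # Index 0 of the full correlation corresponds to a lag of -(len(emitted)-1).
    return int(np.argmax(corr)) - (len(emitted) - 1)

def path_distance_m(delay_samples: int, sample_rate_hz: float) -> float:
    """One-way distance: half the round-trip time times the speed of light."""
    return 0.5 * (delay_samples / sample_rate_hz) * 299_792_458.0
```

A pseudo-random modulation sequence gives a sharp, unambiguous correlation peak, which is one reason such systems modulate the light signal rather than emit it continuously.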
  • an on-screen prompt asks the user to adopt one or more poses for the purpose of determining their measurements (for example by displaying a semitransparent example figure 1012 in the appropriate pose on screen, and asking the user to match the pose as far as possible, by feeding the captured image back to the display under the semitransparent figure).
  • the z-cam facilitates easily cutting the user out of the captured image (identifying and isolating the user within the image) by relating the image pixels to the distance measurements, thereby determining foreground from background.
  • the 'cut out' image of the user could be superposed on the sample figure 1012, or on targets for face, foot or hand target positions, on the TV screen to provide visual feedback as to the user's position and pose.
  • the depth array information is of a lower resolution than the pixel array forming the image. Consequently, some background image data may be included within the cut-out.
  • Refinement of the cut-out image can be performed using known techniques, such as using colour matching based on one or more colour samples taken from well within the presumed area of the user (e.g. corresponding to one depth-array 'pixel' in from the cut-out edge) to reject outlying pixels that do not match.
  • edge detection and object discrimination methods are well known in the art.
  • the refined cut-out image is then available for display and further processes. It will be appreciated that such a cut-out may be literal, i.e. generating a new image comprising only the user image, or may be logical, i.e. identifying the outline of the user within the captured image. The further processes are applicable to either case.
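The colour-based refinement described above can be sketched as follows. This is a minimal illustration; the tolerance value, and the use of the mask's mean colour as the interior sample, are assumptions for the sketch rather than details from the specification.

```python
import numpy as np

def refine_cut_out(image: np.ndarray, mask: np.ndarray, colour_tol: float = 60.0) -> np.ndarray:
    """Refine a coarse depth-derived foreground mask by rejecting pixels
    whose colour lies far from a sample taken from within the user area."""
    sample = image[mask].astype(float).mean(axis=0)          # reference colour
    dist = np.linalg.norm(image.astype(float) - sample, axis=-1)
    return mask & (dist < colour_tol)                        # drop outlying pixels
```

Background pixels wrongly included by the lower-resolution depth array tend to differ strongly in colour from the user, so they fall outside the tolerance and are rejected.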
  • the best instance of the required pose that is achieved by the user can be determined by the percentage of the example figure occluded by the user, or when all the targets are occluded or 'touched' by the user, using detection methods known in the art, for example in EyeToy® games.
  • the user can press a button on a controller 751 to indicate that they think they have achieved the best pose.
  • This best example can be processed (optionally in conjunction with other captured poses) to determine the user's measurements as described below.
  • the depth, or distance, measurements taken by the z-cam can be used by the entertainment device to determine the effective scale of the captured image of the user, and consequently the centimetre- or inch-per-pixel scale with which the person is represented in the captured image.
  • the user may be invited to stand closer to the z-cam and try again, and/or to change lighting conditions.
  • the on-screen template can be increased in size so that the user must occupy a greater proportion of the captured image in order to match the template.
  • the proportion of the captured image that needs to be occupied by the user in order to achieve a suitable scale is dependent upon the resolution of the imaging means 1001. For example, an imaging means with a standard resolution of 640 x 480 pixels would require a person of average height (e.g. 180cm tall) to occupy 40% of the captured image by height in order to achieve a scale of one centimetre per pixel.
  • the scale estimate takes account of any refinement to the cut-out image as described above.
  • the user can explicitly input their height to the entertainment device through a user interface (e.g. the game controller 751).
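The arithmetic behind the 640 x 480 example above can be made explicit; a 180 cm user filling 40% of a 480-pixel image height yields 0.9375 cm per pixel, i.e. approximately the one centimetre per pixel stated.

```python
def cm_per_pixel(user_height_cm: float, image_height_px: int, occupied_fraction: float) -> float:
    """Per-pixel scale: the user's real height divided by the number of
    pixel rows their cut-out image spans."""
    return user_height_cm / (image_height_px * occupied_fraction)

# The example above: a 180 cm user occupying 40% of 480 rows.
print(cm_per_pixel(180.0, 480, 0.4))   # roughly 1 cm per pixel
```

The same function shows why a user standing further away (smaller occupied fraction) gives a coarser scale, prompting the invitation to stand closer and try again.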
  • the cut-out image of the user can be analysed to determine measurements such as inside leg length, arm length, neck, hip, waist, chest and bust measurements and bra size.
  • a face-on image of a person in a 'star' pose (as seen in Figure 4), and a face-on image and a profile image of the person standing upright with arms at their side are used to determine the measurements, although optionally a back image and the remaining side image can also be captured.
  • activity-specific poses may optionally be selected, such as bending forward in profile to capture the shape and extension of the back for clothing related to cycling or rowing, for example.
  • the inside leg length is determined using the 'star' pose.
  • the distance L2 from the bottom of the image to the point where the legs meet is determined by tracking measurement L1.
  • the measurement shrinks to zero where the legs meet.
  • the distance between this point and the inside edge of the bottom of the leg is then measured.
  • both legs can be measured and averaged to get a more accurate estimate.
  • a proportional reduction can be used to determine length to the ankle using average physiological data, optionally differentiated by sex, or alternatively a foot template may be adjusted to fit the user's image in order to estimate ankle height.
  • Additional measurements can be determined by measuring the width of the leg at empirically determined average positions representative of calf or thigh size, and compared with known body morphologies to categorise the shape of the leg (optionally also using other images, such as a profile image). This information can help to determine which cut of trousers, for example, will best fit the user.
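The inside-leg procedure above (tracking measurement L1 down to zero) can be sketched on a binary silhouette. This is a simplified pixel-space illustration; the result would be multiplied by the per-pixel scale to give a real-world length.

```python
import numpy as np

def inside_leg_px(silhouette: np.ndarray) -> int:
    """Scan rows upward from the bottom of a star-pose silhouette; the gap
    between the legs shrinks to zero at the row where the legs meet, and
    the inside leg length (in pixels) is the distance back to the bottom."""
    bottom = int(np.nonzero(silhouette.any(axis=1))[0].max())   # lowest occupied row
    for row in range(bottom, -1, -1):
        cols = np.nonzero(silhouette[row])[0]
        gap = (cols.max() - cols.min() + 1) - len(cols)         # empty pixels between legs
        if gap == 0:
            return bottom - row                                 # legs have met
    return 0
```

Running the scan separately for each leg and averaging, as the text suggests, would refine the estimate further.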
  • waist size is determined using the profile image of the person standing and the face-on image of the person standing or in star pose. Given the position of the top of the user's leg based upon analysis of the 'star' pose as detailed above, then the next narrowest point W1 above this position will determine waist width. The corresponding position on the profile image provides a second measurement W2, allowing an estimation of the circumference of the user about the waist based upon average circumferences for that combination of measurements, or alternatively according to a model of waist shape into which the measurements are input.
  • the z-cam distance information is also used to determine a profile of the user about this point, and can be mirrored to generate a complete profile. This profile can be used in conjunction with measurements W1 and W2 to refine the estimate of circumference or as a further input to the model of waist shape.
  • the hip measurement operates in similar fashion, based upon the next widest point above the top of the leg, and/or below the waist.
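One plausible "model of waist shape" into which the two measurements could be input — an assumption for illustration, not the patent's stated model — is an ellipse whose diameters are the face-on width W1 and the profile depth W2, with its perimeter given by Ramanujan's approximation:

```python
import math

def circumference_estimate(w1: float, w2: float) -> float:
    """Perimeter of an ellipse with diameters w1 and w2 (Ramanujan's
    approximation), as a simple circumference model for waist or hips."""
    a, b = w1 / 2.0, w2 / 2.0
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))
```

The same model serves for the hip and chest estimates described below; the mirrored z-cam profile would allow the ellipse to be replaced by the user's actual cross-sectional shape.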
  • the chest or bust measurement is determined using either the face-on 'star' or standing pose, and the profile image of the person standing.
  • a first measurement B1 is the widest point in profile above the waist, or within a region with respect to shoulder height or head position as determined from average physiological data.
  • the corresponding position in the face-on image provides the second measurement B2.
  • Estimation of the circumference is then similar to that described for hips and waist, and again optionally may include z-cam distance information to generate a profile.
  • bra size is determined directly from z-cam distance measurements at the level of measurement B2, to compare the change in distance over the central region of the bust.
  • the profile image is used to determine bra cup size by identifying a measuring point immediately below the breast for comparison. Such a point is determined by measuring the width of the profile image as the measurement position moves successively down from point B1; the rate of change (i.e. reduction) of width will change once the measurement position is below the breast. Comparing the size of the chest at this point with that of the bust measurement above gives measurement B3, again enabling an estimate of bra cup size.
  • the z-cam estimate and the profile image estimate may be combined to refine the overall measurement.
  • arm length is determined in a similar fashion to leg length. For example, by determining when the arm joins the vertically extending main body, the length A1 from fingertip to armpit may be determined.
  • a hand template may be scaled to fit the user's image in order to estimate a wrist position for clothing.
  • Measurement estimates are then stored on the PS3 hard drive, and optionally are associated with the user's account if more than one person uses the PS3.
  • the data may be password-protected and/or encrypted.
  • a summary of the main measurements may be given to the user, as these may also be helpful for conventional, high-street shopping.
  • the part-processed captured data may also be stored. This includes the cut-out images of the user and optionally the distance data obtained from the z-cam. In the former case, these may be saved as silhouettes to preserve privacy. Alternatively or in addition, cut-out 'images' where each pixel value represents measured or interpolated distance values, either absolute or from a common reference point (e.g. mean distance) can be saved, thereby combining both cut-out and distance data as a distance map of the user. This may be formatted in a conventional and easily transferable data format such as a CompuServe graphic interchange format (GIF) file or similar.
  • Such part-processed captured data can for convenience be referred to as a physical profile of the user. The physical profile will also comprise the scale of the profile data as necessary.
  • the user can determine which data they are comfortable having stored on the hard drive, and in what form.
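Packing the cut-out and distance data into a single image, as suggested above, might be sketched as follows. The 8-bit quantisation relative to the mean distance is an assumed encoding; any conventional image format could then carry the result.

```python
import numpy as np

def distance_map(depth: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Quantise measured distances, taken relative to their mean, into 8-bit
    pixel values (128 = mean distance); background pixels are set to 0."""
    out = np.zeros(depth.shape, dtype=np.uint8)
    d = depth[mask] - depth[mask].mean()                     # relative to common reference
    span = max(abs(float(d.min())), abs(float(d.max()))) or 1.0
    out[mask] = np.clip(128.0 + 127.0 * d / span, 0, 255).astype(np.uint8)
    return out
```

Because zero is reserved for the background, the same array simultaneously records the cut-out (which pixels belong to the user) and the depth of each user pixel.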
  • the physical profile may comprise a full or partial 3D mesh of the user constructed from the depth data.
  • the body image (or just the face) of the user may be transformed to act as a default texture for the 3D mesh.
  • This mesh may be used, for example, as a virtual mannequin to demonstrate clothing available from an on-line retailer.
  • the PS3 described above can connect online to a Sony-run server 2010 hosting retail services, or administering one or more virtual environments within which virtual retailers can be found.
  • the Sony-run server may connect to one or more third party servers 3010 (of which only one is shown for illustration) also providing ranges of goods and services for sale.
  • the PS3 can connect directly to such third party servers in a conventional manner.
  • a user may subsequently browse the internet, for example visiting the online Sony® Playstation store, or visiting a retailer in an online virtual environment, and may search for a garment.
  • the online store or retailer can request the user's body data, being body measurements and/or physical profile data from the PS3, to determine which size garment will best fit the user, and whether this is available.
  • This request and the transmission of the data can be via a secure connection (i.e. encrypted).
  • the transmission of the data may require authorisation from the user, in the form of a yes/no choice or the entry of a password.
  • an online store or retailer must present an electronic certificate demonstrating that it treats such data in confidence, that its size selection process is consistent with a predefined quality level, and/or that it is officially licensed to operate such a service, before the PS3 will transmit the user's measurements and/or part-processed captured data.
  • the server receives the measurement data and compares it to a table or database identifying the closest garment size that corresponds to the user's measurements.
  • the relevant garment can be suggested to the user.
  • the user may be presented with both options, and/or may be prompted to look at a similar item for which there is a better fit (e.g. between different cuts of jeans).
  • the garment size may be only one of several criteria used in a search, others being for example price, brand, or availability.
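Server-side size selection, as described above, might reduce to a nearest-match lookup against a size table. The chart values and the least-absolute-difference criterion below are illustrative assumptions, not details from the specification.

```python
def closest_size(user: dict, size_chart: dict) -> str:
    """Return the garment size whose nominal measurements best match the
    user's, by least summed absolute difference over shared measurements."""
    def fit_error(nominal: dict) -> float:
        return sum(abs(nominal[key] - user[key]) for key in user)
    return min(size_chart, key=lambda name: fit_error(size_chart[name]))

# Hypothetical size chart, measurements in centimetres.
chart = {
    "S": {"waist": 71, "hip": 89},
    "M": {"waist": 81, "hip": 99},
    "L": {"waist": 91, "hip": 109},
}
```

The fit error itself could also be reported, enabling the "better fit elsewhere" prompt mentioned above when two candidate garments straddle the user's measurements.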
  • the server performs measurements on the physical profile data in a manner as described above to determine the required user measurement.
  • the measurement system is distributed between the PS3 and the retailer's server, enabling measurements to be tailored to the requirements of the retailer.
  • an on-line purchase of the garment may be made in a conventional manner.
  • 'garment' may refer not only to clothes but to other items dependent upon body shape and size, such as rucksacks. It will also be understood that the present invention is applicable to the selection of other items that may depend upon body size such as bicycles, and also to furniture such as beds, chairs, tables and other domestic and commercial furniture. Other items may include sports equipment such as golf clubs or cricket bats that can be selected according to the height of the user. Likewise, potentially other services such as travel services (e.g. air travel) or theatre seating reservation may use height information to allocate a passenger to a wider row during an online booking.
  • the online system may be used for more than just the selection of goods and services according to body measurement.
  • Servers can accumulate and analyse measurement data to determine customer trends, thereby allowing retailers to stock goods according to the frequency of measured dimensions encountered, and tracking any changes in body shape over the seasons, years or demographics.
  • Other applications may involve promotions such as, for example, a 'Cinderella' competition, where a garment is selected or made to fit a specific body shape, and is awarded to a person whose body measurements or physical profile data best fit that shape.
  • Other variations include analysing the skin tone of the user and determining complementary colours, so that in addition to size, styling advice such as the recommendation of garments of a particular colour can be provided or highlighted during a search.
  • Another variation includes generating and maintaining a database of hairstyles, for example produced and sponsored by leading hairdressers, which particularly suit certain face shapes. By taking a close-up measurement of the face and referring to the database, a user may receive suggestions for hairstyles and/or beauty products that would complement them.
  • a similar approach may be used for styles of spectacles, and for make-up, optionally in conjunction with the skin tone analysis.
  • whilst the distance data generated by the z-cam can augment and/or cross-check the measurements determined from the captured images, it is possible, as described above, to determine these measurements without the distance data. Consequently, a conventional webcam such as the video camera 756 may be used instead of the z-cam.
  • alternative methods may be required to identify and isolate the user image from the background image. These include using skin tone to identify the user, or asking the user to move slightly to determine the edges of the user image with respect to the background.
  • Other techniques for identifying the human form in an image are known in the art, such as skeletal modelling, template matching and image recognition by neural networks. Such recognition methods may be assisted by using the TV as a variable light source, e.g. by illuminating the room in various flesh-tone hues in order to increase the contrast between the user and the background image.
  • the user can input their height.
  • the user could hold a reference object of known size (for example, a fluorescent ruler or disc or a light source provided as part of the measurement system) in their hand during their poses.
  • a disc has the benefit that it will always have at least one axis (the major axis) that is the correct size regardless of the angle at which it is held.
  • a ball is an even more invariant indicator of size, but is comparatively bulky and may not be desirable to package with the system.
  • Using such a reference object avoids the need to relate the user's height to their captured image, or allows for confirmation/adjustment of any attempt to relate the user's height to their image. By using two metrics to determine the basic scale of the image in this way, the resulting estimate may be more reliable.
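The disc's invariant major axis can be exploited as follows (a sketch only; the brute-force farthest-pair search assumes a small, already-segmented disc mask):

```python
import numpy as np

def scale_from_disc(disc_mask: np.ndarray, true_diameter_cm: float) -> float:
    """A tilted disc projects to an ellipse whose major axis still spans the
    true diameter, so the farthest pair of mask pixels approximates the
    diameter in pixels, giving the cm-per-pixel scale of the image."""
    ys, xs = np.nonzero(disc_mask)
    pts = np.stack([ys, xs], axis=1).astype(float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)   # all pairwise distances
    return true_diameter_cm / float(np.sqrt(d2.max()))
```

This scale estimate can then be averaged with, or used to sanity-check, the estimate derived from the user's stated height.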
  • the system is not limited to implementation on the PS3.
  • a personal computer or other suitable computing device may be used instead of the PS3.
  • a method of body measurement comprises a first step slO of capturing one or more images comprising a user.
  • the user's image is isolated from the background of the or each image.
  • the per-pixel scale of the user's image is determined.
  • data descriptive of the user's body (such as one or more measurements of the user's body, or physical profile data) are generated as described herein.
  • some or all of the descriptive data is stored on the PS3.
  • a corresponding method for selecting goods and services online comprises the above steps, and additionally comprises the steps of transmitting the data descriptive of the user's body to a server providing access to an on-line retailer, and the server selecting from among a plurality of options for goods and/or services offered by the online retailer according to some or all of the data descriptive of a user's body transmitted to the server.
  • the first step slO may comprise capturing distance data.
  • the fourth step s40 may comprise combining image and depth data.
  • the fifth step s50 may comprise storing said image and distance data (i.e. physical profile data).
  • the scale of the image with respect to the user may be determined before or after the user is isolated from the image (for example by measuring the reference object if found in the image). It will also be appreciated that whilst it is preferred that the user's body data is stored, potentially the data could be acquired on demand when shopping online, in which case it need not be placed in long-term storage on the PS3.
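The five steps above can be drawn together as a single pipeline sketch. Step labels s20 and s30 are inferred from the ordering; each stage is passed in as a callable, since any of the techniques described herein could implement it.

```python
def characterise_body(capture, isolate, estimate_scale, measure, store):
    """Steps s10-s50 of the body characterisation method as one pipeline."""
    images = capture()                    # s10: capture image(s), optionally with distance data
    user = isolate(images)                # s20: isolate the user's image from the background
    scale = estimate_scale(user, images)  # s30: determine the per-pixel scale
    data = measure(user, scale)           # s40: generate data descriptive of the user's body
    store(data)                           # s50: store some or all of the descriptive data
    return data
```

As noted above, the storage stage may be a no-op when body data is acquired on demand during online shopping rather than kept in long-term storage.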
  • embodiments of the present invention can be implemented in the PS3 or personal computer and in on-line servers by suitably programming one or more processors therein.
  • a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device.

Abstract

A body characterising system comprises a computing device, an optical imaging device to capture one or more images, and a scale estimator comprising an optical distance measuring device, the computing device comprising an image processor to identify the image of the user within the background of the one or more captured images, and in which the computing device is operable to estimate the scale of the image of the user from the scale estimator, and to generate data descriptive of the user's body dependent upon the identified image of the user and the estimated scale of the identified image of the user.

Description

APPARATUS AND METHOD OF BODY CHARACTERISATION
The present invention relates to an apparatus and method of body characterisation. Methods of scanning the human body have proliferated in recent years, moving from X-Ray and magnetic resonance imaging (MRI) to so-called 'T-Ray' (Tera Hertz frequency) and low-power microwave scanners.
The high-energy imaging systems such as X-Ray and MRI are intended to penetrate the body and reveal internal structures, primarily for medical diagnosis. The mid-energy imaging systems, such as T-Ray and microwave imagers, can penetrate clothing but not skin, and are being introduced for applications such as airport security, in order to reveal concealed weapons and the like (see http://optics.org/cws/article/research/22714).
The ability to detect skin through clothing has also been used to generate measurements of clothed people for the clothing industry; for example, Intellifit® provides a large booth comprising a microwave scanner that can derive measurements of a person standing in the booth that are relevant to the selection of clothes suitable for that person (e.g. see http://www.demo.com/demonstrators/demo2005/54162.php, or http://www.wired.com/science/discoveries/news/2006/09/71813).
However, such devices are large and expensive, and consequently in practice are only available in some shops. As a result they are of no use, for example, to the private individual who wishes to purchase clothing on-line through their home computer. The present invention seeks to address the above problem.
Various respective aspects and features of the invention are defined in the appended claims. Combinations of features from the dependent claims may be combined with features of the independent claims as appropriate and not merely as explicitly set out in the claims.
In a first aspect of the present invention, a body characterising system comprises a computing device; an optical imaging device to capture one or more images; and a scale estimator comprising an optical distance measuring device; the computing device comprising: an image processor to identify the image of the user within the background of the one or more captured images; and in which the computing device is operable to estimate the scale of the image of the user from the scale estimator, and to generate data descriptive of the user's body dependent upon the identified image of the user and the estimated scale of the identified image of the user. In another aspect of the present invention, a system for selecting goods and/or services according to body characteristics comprises: a server providing access to an online retailer; and a body characterising system according to the first aspect, the server and the body measurement system each comprising respective communication means for communication over a network connection; and in which the server is operable to select from among a plurality of options for goods and/or services offered by the online retailer according to some or all of the data descriptive of a user's body transmitted over the network connection to the server by the body measurement system communication means.
Advantageously, the body characterising system employs comparatively cheap and compact optical imaging means to measure the user, making automated characterisations of the user's body affordable and practical at home. The system can use optical imaging because it exploits the fact that in their own home a user can be measured naked or in their underwear, and therefore high-energy scanners that can penetrate clothing are not required.
In conjunction with a server, these characterisations of the user's body can be used to select, for example, the appropriate sized garment for shipping to the user during an on-line purchase.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the invention will now be described with reference to the accompanying drawings, throughout which like parts are referred to by like references, and in which:
Figure 1 is a schematic diagram of an entertainment device;
Figure 2 is a schematic diagram of a cell processor;
Figure 3 is a schematic diagram of a video graphics processor;
Figure 4 is a schematic diagram of a user interacting with a body characterisation system in accordance with an embodiment of the present invention.
Figure 5A is an optical measurement device in accordance with an embodiment of the present invention.
Figure 5B is an optical measurement device in accordance with an embodiment of the present invention.
Figure 6 is a schematic diagram of various measurements on a human body in accordance with an embodiment of the present invention.
Figure 7 is a schematic diagram of a distributed body characterisation system in accordance with an embodiment of the present invention.
Figure 8 is a flow diagram of a method of body characterisation in accordance with an embodiment of the present invention.
In a summary example embodiment of the present invention, an entertainment device such as a personal computer or a Sony ® Playstation 3 ® entertainment machine is connected to a webcam. The entertainment device obtains basic information about the user such as their height, either by requesting the data to be entered manually or by measuring the user's height and/or distance from the camera directly using infra-red or ultrasound sensors. The entertainment device provides visual feedback to the user enabling them to adopt one or more poses for image capture. From these poses and the height and/or distance of the user from the camera the entertainment device can determine the per-pixel scale of the user image received and thus the user's physical dimensions. From these, the entertainment device can derive the user's vital statistics. The device is able to make use of simple and cheap optical frequency imaging technology such as a webcam since, in the privacy of their own home, a person can measure themselves naked or in their underwear, and thus provide an accurate optical measurement of body shape without the need for high-energy scanners such as those found in the prior art. The results can then be stored by the entertainment device until the user chooses to update or discard them, and can be provided by the entertainment device to accredited on-line retailers to enable accurate selection of garment sizes for purchase by the user, thereby improving user satisfaction and saving significant costs on returned ill-fitting goods.
Referring now to Figure 1, in an embodiment of the present invention, the entertainment device is a Sony® Playstation 3® (PS3®). Figure 1 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device; a system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: a Cell processor 100; a Rambus® dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700. The system unit 10 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 700. The I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth® wireless link port 740 capable of supporting up to seven Bluetooth connections. In operation the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 756; and a microphone headset 757. Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
In addition, a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation® or Playstation 2® devices.
In the present embodiment, the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751. In addition to one or more analogue joysticks and conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown). The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link. The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks. The system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310. The audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
Audio processing (generation, decoding and so on) is performed by the Cell processor 100. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, DTS® (Digital Theatre Systems) surround sound, and the decoding of 7.1 surround sound from Blu-Ray® disks.
In the present embodiment, the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (Moving Picture Experts Group) standard for decoding by the system unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions. Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
Referring now to Figure 2, the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the Playstation 2 device's Emotion Engine.
The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload. In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Direct Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown). Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
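The bandwidth figures quoted above follow directly from the per-slot and clock numbers; a minimal arithmetic check (purely illustrative, in Python):

```python
# Worked check of the EIB figures quoted above: 12 participants each
# transferring 8 bytes per clock cycle, at a 3.2 GHz clock.
participants = 12
bytes_per_participant_per_clock = 8
clock_hz = 3.2e9

peak_bytes_per_clock = participants * bytes_per_participant_per_clock  # 96 bytes
peak_gb_per_s = peak_bytes_per_clock * clock_hz / 1e9                  # 307.2 GB/s
```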
The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interface 170A,B comprises a Rambus FlexIO® system interface 172A,B. The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Synthesiser graphics unit 200 via controller 170B.
Data sent by the Cell processor 100 to the Reality Synthesiser graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on. Referring now to Figure 3, the Reality Synthesiser graphics (RSX) unit 200 is a video accelerator based upon the NVidia® G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output. The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216. The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250. The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.
Typically, the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B. The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.
In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor. The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE. Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above. Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these. The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400). In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term "on-line" does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
Referring now to Figures 4, 5A and 5B, in an embodiment of the present invention the entertainment device 10 is connected to the display device 300 (e.g. a television), and to a so-called 'z-cam' 1100 such as that provided by PrimeSense® (see http://www.primesense.com/), in place of the video camera 756. In an embodiment of the present invention as seen in Figure 5A, the z-cam 1100 comprises a charge coupled device (CCD) 1001 or other imaging means, together with a light emitter 1060 operable to emit a pattern of light, such as a grid of lines 1070. This emitter may be a low-power laser able to rapidly 'draw' the pattern by steering of the laser's output beam (by known refractive and/or reflective elements), or a light source coupled with a mask 1080 upon which the pattern is placed. Typically the emitter operates in the near-infrared, which is invisible to the user but can be seen by the CCD and thus captured for analysis. By comparing the known reference pattern (the pattern as emitted) with the pattern as projected on to the user, the distortion and scaling of the pattern can be used to determine the distance between the emitter and the point of reflection, thereby generating an array of distance or depth values in the notional z-axis of the captured image.
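The distortion-to-depth step can be illustrated with the standard structured-light triangulation relation. The function and figures below are illustrative assumptions (baseline, focal length in pixels), not values from the embodiment:

```python
def depth_from_shift(shift_px, baseline_m, focal_px):
    """Triangulated depth from the lateral shift of a projected line:
    z = f * b / disparity, where f is the focal length in pixels and
    b is the emitter-to-CCD baseline in metres (both assumed known
    from calibration)."""
    if shift_px <= 0:
        return float("inf")  # no measurable disparity: point effectively at infinity
    return focal_px * baseline_m / shift_px

# e.g. a grid line displaced by 40 pixels, with a 7.5 cm baseline and a
# 600-pixel focal length, places the reflecting point roughly 1.1 m away
```

Applying this per detected grid line yields the array of depth values described above, at the resolution of the projected pattern.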
Generally the resolution of the illuminated pattern will be limited by the quality of the optics and by effects such as diffraction that blur the pattern and so limit how closely lines in the pattern can be spaced. As the resolution of the depth measurement is dependent upon the resolution of the pattern, optionally several patterns can be used in quick succession (for example a grid pattern followed by the same grid pattern offset by half a square vertically and horizontally) and combined to give, in this example, double the depth resolution possible from either pattern on its own. This may be achieved by alternately illuminating two physically offset masks in quick succession and in synchronisation with respective successive capture cycles of the CCD. Alternatively, if the patterns can be adequately differentiated by the CCD, the plurality of patterns may be used simultaneously.
In another embodiment of the present invention as seen in Figure 5B, the z-cam 1100 comprises a CCD 1001 or other imaging means, together with a remote distance measuring means such as an infra-red or ultrasound source and receiver, to measure reflection path times and so determine distances. The distance measuring means is arranged to form an array of distance measurements that can be correlated with the captured image, thereby augmenting the captured image with depth information (i.e. on the z-axis) with respect to the camera.
The array of optical measurements can be achieved by, for example, use of one or more electrically-moveable micro-mirrors 1130 to steer the infra-red signal in a manner similar to the use of micro-mirrors in digital light projector (DLP) televisions and projectors. Typically the infra-red signal will be from a solid-state laser 1120. A modulator 1110 provides modulation of the light signal which, coupled with the steering of the micro-mirrors, allows correlation with the reflected signals to reconstruct the reflected path distance as the mirror or mirrors conduct a sweep of the observed area. The reflected signal is received by a detector 1140 and correlated with the original signal by an auto-correlator 1150 to determine the path delay. If the timing of the mirror movement is long compared with the reflection path time, such that each detection is completed before the mirror/s move again, then correlation may not be necessary to distinguish each path. For ultrasound, a steered emitter array forming a narrow acoustic beam can be used with similar signal modulation and correlation to that described above.
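The auto-correlation step described above can be sketched as follows: a modulated code is emitted, and the lag of the correlation peak between the emitted and received signals gives the round-trip delay, and hence the path distance. All names and figures here are illustrative assumptions:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def distance_from_reflection(emitted, received, sample_rate_hz):
    """Cross-correlate the received signal against the emitted one; the
    lag of the correlation peak is the round-trip delay in samples,
    which converts to a one-way path distance."""
    corr = np.correlate(received, emitted, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(emitted) - 1)
    round_trip_s = lag_samples / sample_rate_hz
    return C * round_trip_s / 2.0

# A pseudo-random modulation gives a sharp correlation peak: a copy of
# the emitted code delayed by 20 samples at a 1 GHz sample rate is a
# 20 ns round trip, i.e. roughly 3 m of one-way path distance.
```

For the ultrasound variant the same code applies with the speed of sound in place of `C`.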
In either embodiment, it will be appreciated that visible light may also be used instead of or in addition to infra-red.
When the entertainment device is connected to the display 300 and the z-cam 1100, in an embodiment of the present invention an on-screen prompt asks the user to adopt one or more poses for the purpose of determining their measurements (for example, by displaying a semitransparent example figure 1012 in the appropriate pose on screen, asking the user to match the pose as far as possible, and feeding the captured image back to the display under the semitransparent figure). The z-cam facilitates easily cutting the user out of the captured image (identifying and isolating the user within the image) by relating the image pixels to the distance measurements, thereby determining foreground from background. Thus for example the 'cut out' image of the user could be superposed on the sample figure 1012, or on targets for face, foot or hand target positions, on the TV screen to provide visual feedback as to the user's position and pose.
In general, the depth array information is of a lower resolution than the pixel array forming the image. Consequently, some background image data may be included within the cut-out. Refinement of the cut-out image can be performed using known techniques, such as using colour matching based on one or more colour samples taken from well within the presumed area of the user (e.g. corresponding to one depth-array 'pixel' in from the cut-out edge) to reject outlying pixels that do not match. Such edge detection and object discrimination methods are well known in the art. The refined cut-out image is then available for display and further processes. It will be appreciated that such a cut-out may be literal, i.e. generating a new image comprising only the user image, or may be logical, i.e. identifying the outline of the user within the captured image. The further processes are applicable to either case.
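The depth-then-colour refinement just described might be sketched as follows; the function name, threshold values and colour metric are illustrative assumptions:

```python
import numpy as np

def cut_out_user(image, depth, max_depth_m, colour_tol=40.0):
    """Coarse foreground mask from depth thresholding, refined by
    rejecting pixels whose colour is far from a sample taken from
    well within the presumed area of the user."""
    mask = depth < max_depth_m               # coarse foreground/background split
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return mask                          # no foreground detected
    cy, cx = int(ys.mean()), int(xs.mean())  # point presumed well inside the user
    sample = image[cy, cx].astype(float)
    colour_dist = np.linalg.norm(image.astype(float) - sample, axis=-1)
    return mask & (colour_dist < colour_tol)  # reject outlying pixels
```

The returned boolean mask serves equally for a literal cut-out (masking the image) or a logical one (tracing its outline).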
The best instance of the required pose that is achieved by the user can be determined by the percentage of the example figure occluded by the user, or when all the targets are occluded or 'touched' by the user, using detection methods known in the art, for example in EyeToy® games. Alternatively, the user can press a button on a controller 751 to indicate that they think they have achieved the best pose. This best example can be processed (optionally in conjunction with other captured poses) to determine the user's measurements as described below. The depth, or distance, measurements taken by the z-cam can be used by the entertainment device to determine the effective scale of the captured image of the user, and consequently the centimetre- or inch-per-pixel scale with which the person is represented in the captured image. Optionally, if the scale is too small (for example, if measurements cannot be derived with better than ± ½ inch, i.e. 1.25 cm, accuracy), then the user may be invited to stand closer to the z-cam and try again, and/or to change lighting conditions. To encourage the user to stand closer to the z-cam, the on-screen template can be increased in size so that the user must occupy a greater proportion of the captured image in order to match the template. The proportion of the captured image that needs to be occupied by the user in order to achieve a suitable scale is dependent upon the resolution of the imaging means 1001. For example, an imaging means with a standard resolution of 640 x 480 pixels would require a person of average height (e.g. 180 cm tall) to occupy 40% of the captured image by height in order to achieve a scale of one centimetre per pixel.
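The scale and occupancy figures above can be sketched as a pair of small helpers; the function names are illustrative:

```python
def cm_per_pixel(user_height_cm, user_height_px):
    """Centimetre-per-pixel scale with which the user is represented
    in the captured image."""
    return user_height_cm / user_height_px

def required_occupancy(user_height_cm, image_height_px, target_cm_per_px=1.0):
    """Fraction of the image height the user must fill to reach the
    target scale (one centimetre per pixel by default)."""
    return (user_height_cm / target_cm_per_px) / image_height_px

# A 180 cm user filling 40% of a 480-pixel-high image spans 192 pixels,
# i.e. roughly 0.94 cm per pixel, consistent with the worked example above.
```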
The scale estimate takes account of any refinement to the cut-out image as described above. Alternatively or in addition, the user can explicitly input their height to the entertainment device through a user interface (e.g. the game controller 751).
Once an inch- or centimetre-per-pixel scale has been determined, the cut-out image of the user can be analysed to determine measurements such as inside leg length, arm length, neck, hip, waist, chest and bust measurements and bra size. Typically a face-on image of a person in a 'star' pose (as seen in Figure 4), and a face-on image and a profile image of the person standing upright with arms at their side, are used to determine the measurements, although optionally a back image and the remaining side image can also be captured. In addition, activity-specific poses may optionally be selected, such as bending forward in profile to capture the shape and extension of the back for clothing related to cycling or rowing, for example.
Referring now to Figure 6, in an embodiment of the present invention the inside leg length is determined using the 'star' pose. The distance L2 from the bottom of the image to the point where the legs meet is determined by tracking measurement L1. As L1 is traversed up the image, the measurement shrinks to zero where the legs meet. The distance between this point and the inside edge of the bottom of the leg is then measured. Optionally, both legs can be measured and averaged to get a more accurate estimate. Also optionally, a proportional reduction can be used to determine length to the ankle using average physiological data, optionally differentiated by sex, or alternatively a foot template may be adjusted to fit the user's image in order to estimate ankle height. Additional measurements, such as calf and thigh width, can be determined by measuring the width of the leg at empirically determined average positions representative of calf or thigh size, and compared with known body morphologies to categorise the shape of the leg (optionally also using other images, such as a profile image). This information can help to determine which cut of trousers, for example, will best fit the user.
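The upward scan for the point where L1 shrinks to zero can be sketched on a binary silhouette of the 'star' pose; the function name and array convention are assumptions:

```python
import numpy as np

def crotch_height_px(silhouette):
    """Scan a 'star' pose silhouette upward from the bottom row and
    return how many rows up the gap between the legs closes, i.e. the
    row at which measurement L1 shrinks to zero (giving distance L2)."""
    height = silhouette.shape[0]
    for rows_up, y in enumerate(range(height - 1, -1, -1)):
        xs = np.nonzero(silhouette[y])[0]
        if xs.size == 0:
            continue  # empty row, e.g. below the feet
        # the legs have met when the filled span on this row is contiguous
        if xs[-1] - xs[0] + 1 == xs.size:
            return rows_up
    return height

# multiplying the returned row count by the cm-per-pixel scale gives L2;
# subtracting the estimated ankle height then yields the inside leg length
```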
In an embodiment of the present invention, waist size is determined using the profile image of the person standing and the face-on image of the person standing or in star pose. Given the position of the top of the user's leg based upon analysis of the 'star' pose as detailed above, then the next narrowest point W1 above this position will determine waist width. The corresponding position on the profile image provides a second measurement W2, allowing an estimation of the circumference of the user about the waist based upon average circumferences for that combination of measurements, or alternatively according to a model of waist shape into which the measurements are input. Optionally, the z-cam distance information is also used to determine a profile of the user about this point, and can be mirrored to generate a complete profile. This profile can be used in conjunction with measurements W1 and W2 to refine the estimate of circumference or as a further input to the model of waist shape.
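One simple 'model of waist shape into which the measurements are input' is an elliptical cross-section; the sketch below is an assumed model, not the patent's, using Ramanujan's approximation for the perimeter of an ellipse:

```python
import math

def circumference_cm(w1_cm, w2_cm):
    """Estimate a body circumference from a face-on width W1 and a
    profile depth W2, modelling the cross-section as an ellipse and
    applying Ramanujan's perimeter approximation."""
    a, b = w1_cm / 2.0, w2_cm / 2.0  # semi-axes
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))
```

For a circular cross-section (W1 = W2) this reduces to πW1; for typical waist proportions the approximation error is negligible compared with the measurement error. The same model serves for the hip and chest estimates described below.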
The hip measurement operates in similar fashion, based upon the next widest point above the top of the leg, and/or below the waist.
In an embodiment of the present invention, the chest or bust measurement is determined using either the face-on 'star' or standing pose, and the profile image of the person standing. A first measurement B1 is the widest point in profile above the waist, or within a region with respect to shoulder height or head position as determined from average physiological data. The corresponding position in the face-on image provides the second measurement B2. Estimation of the circumference is then similar to that described for hips and waist, and again optionally may include z-cam distance information to generate a profile.
In an embodiment of the present invention, bra size is determined directly from z-cam distance measurements at the level of measurement B2, to compare the change in distance over the central region of the bust. Alternatively or in addition, the profile image is used to determine bra cup size by identifying a measuring point immediately below the breast for comparison. Such a point is determined by measuring the width of the profile image as the measurement position moves successively down from point B1; the rate of change (i.e. reduction) of width will change once the measurement position is below the breast. Comparing the size of the chest at this point with that of the bust measurement above gives measurement B3, again enabling an estimate of bra cup size. The z-cam estimate and the profile image estimate may be combined to refine the overall measurement.
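Detecting the point 'immediately below the breast' from the change in the rate of width reduction might, under one simple interpretation (an assumption made here for illustration), be taken as the first local minimum of profile width below B1:

```python
import numpy as np

def under_bust_row(profile_widths, b1_row):
    """Moving down from the bust line B1, return the first row at which
    the profile width stops decreasing: a simple proxy for the point
    where the rate of change of width changes below the breast."""
    w = np.asarray(profile_widths, dtype=float)
    for y in range(b1_row + 1, len(w) - 1):
        if w[y] <= w[y - 1] and w[y] < w[y + 1]:
            return y  # local minimum: chest width just below the breast
    return len(w) - 1
```

Measurement B3 then follows by comparing the width at the returned row with the bust width at B1.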
In an embodiment of the present invention, arm length is determined in a similar fashion to leg length. For example, by determining when the arm joins the vertically extending main body, the length A1 from fingertip to armpit may be determined. A hand template may be scaled to fit the user's image in order to estimate a wrist position for clothing.
It will be apparent that other measurements, such as neck or stomach, may be made based on similar principles. It will be appreciated that some body shapes do not conform to these heuristics. The user's body may be compared to templates to categorise their body type, for example as an endomorph (round and soft), mesomorph (hourglass female, athletic male) or ectomorph (lean and slim), and different heuristics apparent to the person skilled in the art may then be applied appropriately. Such categorisation may also be more directly related to fashion requirements, such as a categorisation of leg type in terms of suitable cuts for trousers.
Measurement estimates are then stored on the PS3 hard drive, and optionally are associated with the user's account if more than one person uses the PS3. Optionally, the data may be password-protected and/or encrypted. A summary of the main measurements may be given to the user, as these may also be helpful for conventional, high-street shopping.
Alternatively or in addition, the part-processed captured data may also be stored. This includes the cut-out images of the user and optionally the distance data obtained from the z-cam. In the former case, these may be saved as silhouettes to preserve privacy. Alternatively or in addition, cut-out 'images' where each pixel value represents measured or interpolated distance values, either absolute or from a common reference point (e.g. mean distance), can be saved, thereby combining both cut-out and distance data as a distance map of the user. This may be formatted in a conventional and easily transferable data format such as a CompuServe Graphics Interchange Format (GIF) file or similar. Such part-processed captured data can for convenience be referred to as a physical profile of the user. The physical profile will also comprise the scale of the profile data as necessary. Optionally, the user can determine which data they are comfortable having stored on the hard drive, and in what form.
Alternatively or in addition, the physical profile may comprise a full or partial 3D mesh of the user constructed from the depth data. The body image (or just the face) of the user may be transformed to act as a default texture for the 3D mesh. This mesh may be used, for example, as a virtual mannequin to demonstrate clothing available from an on-line retailer.
Referring now to Figure 7, the PS3 described above can connect online to a Sony-run server 2010 hosting retail services, or administering one or more virtual environments within which virtual retailers can be found. In turn the Sony-run server may connect to one or more third party servers 3010 (of which only one is shown for illustration) also providing ranges of goods and services for sale. Alternatively or in addition, the PS3 can connect directly to such third party servers in a conventional manner. In use, a user may subsequently browse the internet, for example visiting the online Sony® Playstation store, or visiting a retailer in an online virtual environment, and may search for a garment. The online store or retailer can request the user's body data, being body measurements and/or physical profile data, from the PS3, to determine which size garment will best fit the user, and whether this is available. This request and the transmission of the data can be via a secure connection (i.e. encrypted). Optionally, the transmission of the data may require authorisation from the user, in the form of a yes/no choice or the entry of a password.
Optionally, an online store or retailer must present an electronic certificate demonstrating that it treats such data in confidence, that its size selection process is consistent with a predefined quality level, and/or that it is officially licensed to operate such a service, before the PS3 will transmit the user's measurements and/or part-processed captured data.
The server receives the measurement data and compares it to a table or database identifying the closest garment size that corresponds to the user's measurements. In the case of an exact match, the relevant garment can be suggested to the user. In the case where there are several possible matches (e.g. one slightly large, one slightly small) the user may be presented with both options, and/or may be prompted to look at a similar item for which there is a better fit (e.g. between different cuts of jeans). It will be appreciated that the garment size may be only one of several criteria used in a search, others being for example price, brand, or availability.
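The server-side lookup described above can be sketched as follows: exact matches within a tolerance are suggested directly, otherwise the nearest size either side of the user's measurement is offered. The size table and names below are illustrative, not from the patent:

```python
SIZE_TABLE = {"S": 76, "M": 84, "L": 92, "XL": 100}  # nominal chest sizes, cm (assumed)

def suggest_sizes(chest_cm, tolerance_cm=2.0):
    """Return exact matches within the tolerance, or failing that the
    slightly-small and slightly-large options either side."""
    exact = [name for name, c in SIZE_TABLE.items() if abs(c - chest_cm) <= tolerance_cm]
    if exact:
        return exact
    smaller = [(c, name) for name, c in SIZE_TABLE.items() if c < chest_cm]
    larger = [(c, name) for name, c in SIZE_TABLE.items() if c > chest_cm]
    options = []
    if smaller:
        options.append(max(smaller)[1])  # the slightly small option
    if larger:
        options.append(min(larger)[1])   # the slightly large option
    return options
```

In practice the garment size would be one criterion among several (price, brand, availability) in the search.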
In the case of only physical profile data being transmitted (or a measurement being required that was not transmitted), the server performs measurements on the physical profile data in a manner as described above to determine the required user measurement. In this way, the measurement system is distributed between the PS3 and the retailer's server, enabling measurements to be tailored to the requirements of the retailer.
Once a garment is selected by the user, an on-line purchase of the garment may be made in a conventional manner.
It will be understood that 'garment' may refer not only to clothes but to other items dependent upon body shape and size, such as rucksacks. It will be also understood that the present invention is applicable for the selection of other items that may depend upon body size such as bicycles, and also to furniture such as beds, chairs, tables and other domestic and commercial furniture. Other items may include sports equipment such as golf clubs or cricket bats that can be selected according to the height of the user. Likewise, potentially other services such as travel services (e.g. air travel) or theatre seating reservation may use height information to allocate a passenger to a wider row during an online booking.
It will be appreciated that the online system may be used for more than just the selection of goods and services according to body measurement. Servers can accumulate and analyse measurement data to determine customer trends, thereby allowing retailers to stock goods according to the frequency of measured dimensions encountered, and tracking any changes in body shape over the seasons, years or demographics. Other applications may involve promotions such as, for example, a 'Cinderella' competition, where a garment is selected or made to fit a specific body shape, and is awarded to a person whose body measurements or physical profile data best fit that shape.
Other variations include analysing the skin tone of the user and determining complementary colours, so that in addition to size, styling advice such as the recommendation of garments of a particular colour can be provided or highlighted during a search. Another variation includes generating and maintaining a database of hairstyles, for example produced and sponsored by leading hairdressers, which particularly suit certain face shapes. By taking a close-up measurement of the face and referring to the database, a user may receive suggestions for hairstyles and/or beauty products that would complement them. A similar approach may be used for styles of spectacles, and for make-up, optionally in conjunction with the skin tone analysis.
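As a rough illustration of the complementary-colour idea, a sampled skin tone can be converted to HSV and its hue rotated by half a turn. The sketch below uses Python's standard colorsys module; the sample tone is an arbitrary assumption:

```python
import colorsys

def complementary_colour(rgb):
    """Rotate the hue of an (R, G, B) tone by 180 degrees, keeping its
    saturation and brightness, to suggest a complementary colour."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    r2, g2, b2 = colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

# an assumed warm skin tone maps to a cool blue
print(complementary_colour((224, 172, 105)))
```

A real recommendation engine would of course map the complementary hue onto the retailer's actual garment palette rather than an arbitrary RGB triple.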
It will be appreciated that whilst the distance data generated by the z-cam can augment and/or cross-check the measurements determined from the captured images, as described above it is possible to determine these measurements without the distance data. Consequently, a conventional webcam such as the video camera 756 may be used instead of the z-cam. In this case, however, alternative methods may be required to identify and isolate the user image from the background image. These include using skin tone to identify the user, or asking the user to move slightly so that the edges of the user image can be determined with respect to the background. Other techniques for identifying the human form in an image are known in the art, such as skeletal modelling, template matching and image recognition by neural networks. Such recognition methods may be assisted by using the TV as a variable light source, e.g. by illuminating the room in various flesh-tone hues in order to increase the contrast between the user and the background image.
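The 'move slightly' approach amounts to frame differencing. A minimal sketch, using nested lists in place of greyscale camera frames and an assumed brightness threshold:

```python
def moving_mask(frame_a, frame_b, threshold=20):
    """Flag pixels whose brightness changed by more than `threshold`
    between two captures; with a static background, the flagged pixels
    approximate the edges and body of the moving user."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

background = [[10, 10, 10],
              [10, 10, 10]]
with_user  = [[10, 200, 10],
              [10, 190, 10]]
# only the middle column, where the user appears, is marked as foreground
print(moving_mask(background, with_user))
```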
To determine the scale of the image in the absence of direct distance measurements, the user can input their height. However, it may be difficult to relate the user's stated height to the captured image of the user, depending in particular upon hairstyle. Consequently, a heuristic relating total height to height at eye level (or at the ears, nose, mouth or any other readily identifiable feature of the body) for a typical person may be used.
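Such a heuristic might look like the following sketch, where the 0.936 eye-level ratio and the pixel positions are illustrative assumptions rather than values from this document:

```python
EYE_LEVEL_RATIO = 0.936  # assumed: eye level as a fraction of standing height

def pixels_per_cm(stated_height_cm, eye_row, floor_row):
    """Estimate image scale from the user's stated height and the pixel
    rows of their eyes and the floor, side-stepping hairstyle entirely.
    Image rows are assumed to increase downwards."""
    eye_height_cm = stated_height_cm * EYE_LEVEL_RATIO
    eye_height_px = floor_row - eye_row
    return eye_height_px / eye_height_cm

# a 180 cm user whose eyes are 840 pixels above the detected floor line
scale = pixels_per_cm(180, eye_row=120, floor_row=960)
print(round(scale, 3))  # pixels per centimetre
```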
Alternatively or in addition, the user could hold a reference object of known size (for example, a fluorescent ruler or disc, or a light source provided as part of the measurement system) in their hand during their poses. A disc has the benefit that it will always have at least one axis (the major axis) that is the correct size regardless of the angle at which it is held.
A ball is an even more invariant indicator of size, but is comparatively bulky and may not be desirable to package with the system. Using such a reference object avoids the need to relate the user's height to their captured image, or allows for confirmation/adjustment of any attempt to relate the user's height to their image. By using two metrics to determine the basic scale of the image in this way, the resulting estimate may be more reliable.
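The disc's property can be exploited directly: however the disc is tilted, the longest chord across its projected ellipse still spans the true diameter. A sketch, with the edge-pixel list and diameter as assumed inputs from some upstream detector:

```python
import math

def per_pixel_scale(disc_edge_px, disc_diameter_cm):
    """Divide the disc's known diameter by the longest pairwise distance
    among its detected edge pixels (the projected ellipse's major axis)."""
    major_axis_px = max(
        math.dist(p, q) for p in disc_edge_px for q in disc_edge_px
    )
    return disc_diameter_cm / major_axis_px  # centimetres per pixel

# edge pixels of a 10 cm disc viewed at an angle (foreshortened vertically)
edge = [(0, 0), (100, 0), (50, 20), (50, -20)]
print(per_pixel_scale(edge, 10.0))
```

The brute-force pairwise search is quadratic in the number of edge pixels; a production detector would fit an ellipse instead, but the major-axis principle is the same.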
It will be understood that the system is not limited to implementation on the PS3. For example, a personal computer or other suitable computing device may be used instead of the PS3.
Referring now to Figure 8, in an embodiment of the present invention a method of body measurement comprises a first step s10 of capturing one or more images comprising a user. In a second step s20, the user's image is isolated from the background of the or each image. In a third step s30, the per-pixel scale of the user's image is determined. In a fourth step s40, data descriptive of the user's body (such as one or more measurements of the user's body, or physical profile data) are generated as described herein. In a fifth step s50, some or all of the descriptive data is stored on the PS3.
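Steps s10 to s50 can be sketched end to end as follows; every helper is a deliberately trivial stand-in (thresholding for segmentation, a stated height for scale), intended only to show how the stages hand data to one another:

```python
def isolate_user(frame, threshold=50):
    """s20: crude segmentation - zero out background-dark pixels."""
    return [[px if px > threshold else 0 for px in row] for row in frame]

def per_pixel_scale(user_image, stated_height_cm):
    """s30: centimetres per pixel from the user's extent in the image."""
    rows = [i for i, row in enumerate(user_image) if any(row)]
    return stated_height_cm / (rows[-1] - rows[0] + 1)

def derive_measurements(user_image, cm_per_px):
    """s40: an example measurement - the widest row of the silhouette."""
    widths = [sum(1 for px in row if px) for row in user_image if any(row)]
    return {"max_width_cm": max(widths) * cm_per_px}

frame = [[0,   0,   0, 0],          # s10: a captured (toy) image
         [0, 200,   0, 0],
         [0, 200, 210, 0],
         [0, 200,   0, 0]]
user = isolate_user(frame)
measurements = derive_measurements(user, per_pixel_scale(user, 180))
print(measurements)  # s50 would store this on the PS3
```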
A corresponding method for selecting goods and services online comprises the above steps, and additionally comprises the steps of transmitting the data descriptive of the user's body to a server providing access to an on-line retailer, and the server selecting from among a plurality of options for goods and/or services offered by the online retailer according to some or all of the data descriptive of a user's body transmitted to the server.
In addition, for either method the first step s10 may comprise capturing distance data. Alternatively or in addition, the fourth step s40 may comprise combining image and depth data, and the fifth step s50 may comprise storing said image and distance data (i.e. physical profile data).
It will be appreciated that the scale of the image with respect to the user may be determined before or after the user is isolated from the image (for example by measuring the reference object if found in the image). It will also be appreciated that whilst it is preferred that the user's body data is stored, potentially the data could be acquired on demand when shopping online, in which case it need not be placed in long-term storage on the PS3.
In any event, it will be apparent to a person skilled in the art that variations in the above methods corresponding to operation of the various embodiments of the apparatus disclosed herein are considered to be within the scope of the present invention, including but not limited to:
Saving image data of the user as a silhouette;
Transmitting image and optionally depth data to an online retailer; and
Generating measurements from transmitted image and depth data at the server of the online retailer.
It will be appreciated that embodiments of the present invention can be implemented in the PS3 or personal computer and in on-line servers by suitably programming one or more processors therein. Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the conventional equivalent device.

Claims

1. A body characterising system, comprising: a computing device; an optical imaging device to capture one or more images; and a scale estimator comprising an optical distance measuring device; the computing device comprising: an image processor to identify the image of the user within the background of the one or more captured images; and in which the computing device is operable to estimate the scale of the image of the user from the scale estimator, and to generate data descriptive of the user's body dependent upon the identified image of the user and the estimated scale of the identified image of the user.
2. A body characterising system according to claim 1, in which the optical distance measuring device uses visible or infra-red light.
3. A body characterising system according to claim 1 or claim 2, in which the optical distance measuring device comprises: a light emitter; a light detector; and one or more moveable mirrors; and in which the path length of light reflected in successive different directions by one or more moveable mirrors is detected and measured to construct an array of distance measurements.
4. A body characterising system according to claim 1 or claim 2, in which the optical distance measuring device comprises: a light pattern projector; and in which the light pattern projector is arranged to project a predetermined pattern of light; the optical imaging device is arranged to capture one or more images comprising the user illuminated by said pattern of light; and the image processor is arranged to compare the pattern of light captured in one or more of the captured images with the predetermined pattern, and to calculate the distance to illuminated sections of the pattern on the user that is required to account for differences between the captured pattern of light illuminating the user and the predetermined pattern.
5. A body characterising system according to claim 4, in which the light pattern projector is operable to project two or more predetermined patterns of light.
6. A body characterising system according to claim 5, in which the two or more predetermined patterns are substantially identical but spatially offset with respect to each other.
7. A body characterising system according to any one of the preceding claims, in which the data descriptive of the user's body comprises one or more measurements selected from the list consisting of: i. inside leg length; ii. arm length; iii. hip size; iv. waist size; v. chest size; vi. bust size; and vii. bra cup size.
8. A body characterising system according to any one of the preceding claims, in which the data descriptive of the user's body comprises one or more selected from the list consisting of: i. one or more isolated images of the user; ii. one or more isolated silhouettes of the user; iii. one or more distance maps of the user; and iv. a full or partial 3D mesh of the user; accompanied by an indication of their scale.
9. A system for selecting goods and/or services according to body characteristics, comprising: a server providing access to an online retailer; and a body characterising system according to any one of the preceding claims, the server and the body characterising system each comprising respective communication means for communication over a network connection; and in which the server is operable to select from among a plurality of options for goods and/or services offered by the online retailer responsive to some or all of the data descriptive of a user's body transmitted over the network connection to the server by the body characterising system communication means.
10. A system for selecting goods and/or services according to body characteristics in accordance with claim 9, in which the goods and/or services comprise garments.
11. A system for selecting goods and/or services according to body characteristics in accordance with claim 9, in which the goods and/or services comprise beauty products or styling advice.
12. A system for selecting goods and/or services according to body characteristics in accordance with claim 9, in which the goods and/or services comprise one or more selected from the list consisting of: i. transport; ii. furniture; iii. sporting equipment; and iv. seating reservations.
13. A system for selecting goods and/or services according to body characteristics in accordance with claim 9, in which the server comprises: a receiver to receive data descriptive of a user's body; the data comprising one or more selected from the list consisting of: i. one or more isolated images of the user; ii. one or more isolated silhouettes of the user; iii. one or more distance maps of the user; and iv. a full or partial 3D mesh of the user, accompanied by an indication of their scale; and in which the server is operable to generate one or more measurements of the user's body dependent upon the data descriptive of a user's body and the indicated scale.
14. A system for selecting goods and/or services according to body characteristics in accordance with claim 9, in which the server is arranged to administer one or more on-line virtual environments comprising one or more virtual retailers.
15. A method of characterising a body, comprising the steps of: capturing one or more images in an optical imaging device; processing the one or more images to identify an image of a user within each respective image; estimating the scale of the or each image by measuring the distance to the user from the optical imaging device using an optical distance measuring device; determining the per-pixel scale of the image of the user in each respective image; and generating data descriptive of the user's body dependent upon the identified image of the user and an estimated scale of the identified image of the user.
16. A method according to claim 15, in which the optical distance measuring device comprises: a light emitter; and one or more moveable mirrors; and the method comprises the step of: measuring the path length of light reflected in successive different directions by one or more moveable mirrors, to construct an array of distance measurements.
17. A method according to claim 15, in which the optical distance measuring device comprises: a light pattern projector; and the method comprises the steps of: the light pattern projector projecting a predetermined pattern of light; the optical imaging device capturing one or more images comprising the user illuminated by said pattern of light; and comparing the pattern of light captured in one or more of the captured images with the predetermined pattern, and calculating the distance to illuminated sections of the pattern on the user that is required to account for differences between the captured pattern of light illuminating the user and the predetermined pattern.
18. A method according to claim 17, in which two or more patterns are projected by the light pattern projector.
19. A method according to claim 18, in which the two or more patterns are substantially identical but spatially offset with respect to each other.
20. A method according to any one of claims 15 to 19, in which the data descriptive of the user's body comprises one or more measurements selected from the list consisting of: i. inside leg length; ii. arm length; iii. hip size; iv. waist size; v. chest size; vi. bust size; and vii. bra cup size.
21. A method according to any one of claims 15 to 20, in which the data descriptive of the user's body comprises one or more selected from the list consisting of: i. one or more isolated images of the user; ii. one or more isolated silhouettes of the user; iii. one or more distance maps of the user; and iv. a full or partial 3D mesh of the user; accompanied by an indication of their scale.
22. A method of selecting goods and/or services according to body characteristics, comprising the steps of: capturing one or more images in an optical imaging device; processing the one or more images to identify an image of a user within each respective image; estimating the scale of the or each image by measuring the distance to the user from the optical imaging device using an optical distance measuring device; determining the per-pixel scale of the image of the user in each respective image; generating data descriptive of the user's body dependent upon the identified image of the user and an estimated scale of the identified image of the user; transmitting the data descriptive of the user's body to a server providing access to an on-line retailer; and the server selecting from among a plurality of options for goods and/or services offered by the online retailer according to some or all of the data descriptive of a user's body transmitted to the server.
23. A method of selecting goods and/or services according to body characteristics in accordance with claim 22, in which the goods and/or services comprise garments.
24. A method of selecting goods and/or services according to body characteristics in accordance with claim 22, in which the goods and/or services comprise beauty products or styling advice.
25. A method of selecting goods and/or services according to body characteristics in accordance with claim 22, in which the goods and/or services comprise one or more selected from the list consisting of: i. transport; ii. furniture; iii. sporting equipment; and iv. seating reservations.
26. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a body characterising system according to any one of claims 1 to 8.
27. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a component of a system for selecting goods and/or services according to body characteristics according to any one of claims 9 to 14.
28. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 15 to 21.
29. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 22 to 25.
30. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a body characterising system according to any one of claims 1 to 8.
31. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a component of a system for selecting goods and/or services according to body characteristics according to any one of claims 9 to 14.
32. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 15 to 21.
33. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 22 to 25.
PCT/GB2008/001464 2007-05-29 2008-04-25 Apparatus and method of body characterisation WO2008145952A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0710198.3 2007-05-29
GB0710198A GB2449648B (en) 2007-05-29 2007-05-29 Apparatus and method of body characterisation

Publications (1)

Publication Number Publication Date
WO2008145952A1 true WO2008145952A1 (en) 2008-12-04

Family

ID=38265472

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2008/001464 WO2008145952A1 (en) 2007-05-29 2008-04-25 Apparatus and method of body characterisation

Country Status (2)

Country Link
GB (1) GB2449648B (en)
WO (1) WO2008145952A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2948272B1 (en) * 2009-07-27 2011-09-30 Decathlon Sa METHOD AND SYSTEM FOR NON-CONTACT DETERMINATION OF MORPHOLOGICAL DATA OF A SUBJECT
CA2868276A1 (en) * 2011-03-23 2013-09-27 Mgestyk Technologies Inc. Apparatus and system for interfacing with computers and other electronic devices through gestures by using depth sensing and methods of use
WO2013160489A1 (en) * 2012-04-27 2013-10-31 Visiona Control Insdustrial S.L. Method and system for generating and applying three-dimensional reconstructions
EP3162289A1 (en) * 2015-10-26 2017-05-03 Akern S.r.L. A method for estimating antropometric and/or auxologic measurements through digital images

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5953448A (en) * 1996-03-01 1999-09-14 Textile/Clothing Technology Corporation Contour measurement of an object having a discontinuous surface using block point identification techniques
US20020016631A1 (en) * 2000-05-03 2002-02-07 Marchitto Kevin S. Prosthesis and method of making
US6490534B1 (en) * 2000-04-25 2002-12-03 Henry Pfister Camera measurement system
EP1297782A1 (en) * 2001-10-01 2003-04-02 L'oreal Beauty analysis of external body conditions

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0029954B1 (en) * 1979-11-29 1985-08-14 Ulrich M. Landwehr Method and apparatus for ascertaining the dimensions of the human body for clothing purposes by photography
US6307568B1 (en) * 1998-10-28 2001-10-23 Imaginarix Ltd. Virtual dressing over the internet
WO2004008898A1 (en) * 1999-08-27 2004-01-29 Jacob Minsky Method of measuring body measurements for custom apparel manufacturing
US20010034668A1 (en) * 2000-01-29 2001-10-25 Whitworth Brian L. Virtual picture hanging via the internet
US6549639B1 (en) * 2000-05-01 2003-04-15 Genovation Inc. Body part imaging system
KR20050015360A (en) * 2003-08-05 2005-02-21 황후 The electronic shopping which uses the 3D AVATAR system and the method which will drive
US20060149638A1 (en) * 2005-01-06 2006-07-06 Allen Anita L Electronic personalized clothing makeover assistant
JP4473754B2 (en) * 2005-03-11 2010-06-02 株式会社東芝 Virtual fitting device


Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8439733B2 (en) 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8690670B2 (en) 2007-06-14 2014-04-08 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8444486B2 (en) 2007-06-14 2013-05-21 Harmonix Music Systems, Inc. Systems and methods for indicating input actions in a rhythm-action game
US8678895B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for online band matching in a rhythm action game
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
WO2010115814A1 (en) * 2009-04-09 2010-10-14 Telefonaktiebolaget L M Ericsson (Publ) Three-dimensional reconstruction of scenes and objects
US8228367B2 (en) 2009-04-09 2012-07-24 Telefonaktiebolaget Lm Ericsson (Publ) Three-dimensional reconstruction of scenes and objects
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8818883B2 (en) 2009-07-23 2014-08-26 Apple Inc. Personalized shopping avatar
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US8568234B2 (en) 2010-03-16 2013-10-29 Harmonix Music Systems, Inc. Simulating musical instruments
US8636572B2 (en) 2010-03-16 2014-01-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9278286B2 (en) 2010-03-16 2016-03-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8550908B2 (en) 2010-03-16 2013-10-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US10043068B1 (en) 2010-05-31 2018-08-07 Andrew S. Hansen Body modeling and garment fitting using an electronic device
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
US8444464B2 (en) 2010-06-11 2013-05-21 Harmonix Music Systems, Inc. Prompting a player of a dance game
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
WO2012095783A1 (en) 2011-01-12 2012-07-19 Koninklijke Philips Electronics N.V. Improved detection of breathing in the bedroom
US9993193B2 (en) 2011-01-12 2018-06-12 Koninklijke Philips N.V. Detection of breathing in the bedroom
EP2641539A1 (en) * 2012-03-21 2013-09-25 Rocket eleven GmbH Method for determining the dimensions of a body part; method and apparatus for determining the dimensions of a garment
NL2010027C2 (en) * 2012-12-20 2014-06-23 Suit Supply B V Method for determining clothing sizes of persons from digital pictures.
WO2014098591A1 (en) * 2012-12-20 2014-06-26 Suit Supply B.V. Method for determining clothing sizes of persons from digital pictures
JPWO2016035350A1 (en) * 2014-09-02 2017-06-15 株式会社sizebook Portable information terminal, control method thereof, and control program
WO2016035350A1 (en) * 2014-09-02 2016-03-10 株式会社sizebook Portable information terminal, and control method and control program therefor
CN104622474A (en) * 2015-03-09 2015-05-20 莆田市荔城区聚慧科技咨询有限公司 Surface depth measuring device
US10276179B2 (en) 2017-03-06 2019-04-30 Microsoft Technology Licensing, Llc Speech enhancement with low-order non-negative matrix factorization
US10528147B2 (en) 2017-03-06 2020-01-07 Microsoft Technology Licensing, Llc Ultrasonic based gesture recognition
US10984315B2 (en) 2017-04-28 2021-04-20 Microsoft Technology Licensing, Llc Learning-based noise reduction in data produced by a network of sensors, such as one incorporated into loose-fitting clothing worn by a person
CN110123331A (en) * 2019-04-17 2019-08-16 平安科技(深圳)有限公司 Human body body and constitution collecting method, device and storage medium

Also Published As

Publication number Publication date
GB2449648B (en) 2009-05-06
GB2449648A (en) 2008-12-03
GB0710198D0 (en) 2007-07-04

Similar Documents

Publication Publication Date Title
WO2008145952A1 (en) Apparatus and method of body characterisation
RU2668408C2 (en) Devices, systems and methods of virtualising mirror
US9369638B2 (en) Methods for extracting objects from digital images and for performing color change on the object
US8976160B2 (en) User interface and authentication for a virtual mirror
US8970569B2 (en) Devices, systems and methods of virtualizing a mirror
US8982110B2 (en) Method for image transformation, augmented reality, and teleperence
Giovanni et al. Virtual try-on using kinect and HD camera
US20190311488A1 (en) Method and system for wireless ultra-low footprint body scanning
US20140180647A1 (en) Perceptually guided capture and stylization of 3d human figures
CN105210093B (en) Apparatus, system and method for capturing and displaying appearance
US9098873B2 (en) Motion-based interactive shopping environment
KR101911133B1 (en) Avatar construction using depth camera
US8913809B2 (en) Monitoring physical body changes via image sensor
US20160078663A1 (en) Cloud server body scan data system
US20220188897A1 (en) Methods and systems for determining body measurements and providing clothing size recommendations
US20120095589A1 (en) System and method for 3d shape measurements and for virtual fitting room internet service
US20130179288A1 (en) Collecting and using anthropometric measurements
CN107211165A (en) Devices, systems, and methods for automatically delaying video display
CN109196516A (en) Plant control unit and apparatus control method
JP2020505712A (en) How to create a 3D virtual representation of a person
WO2018182938A1 (en) Method and system for wireless ultra-low footprint body scanning
Dayik et al. Real-time virtual clothes try-on system
BR112016002493B1 (en) METHOD FOR PERFORMING COLOR CHANGE IN AN OBJECT WITHIN A DIGITAL IMAGE

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08737109

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08737109

Country of ref document: EP

Kind code of ref document: A1