WO2012138452A1 - Item model based on descriptor and images - Google Patents

Item model based on descriptor and images

Info

Publication number
WO2012138452A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
model
product
descriptor
images
Prior art date
Application number
PCT/US2012/028785
Other languages
French (fr)
Inventor
Sajeev PILLAI
Original Assignee
Ebay Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ebay Inc. filed Critical Ebay Inc.
Priority to CA2832227A priority Critical patent/CA2832227C/en
Priority to CN201280022041.XA priority patent/CN103548051B/en
Priority to KR1020127032750A priority patent/KR101420041B1/en
Priority to EP12767928.0A priority patent/EP2695130A4/en
Priority to AU2012240539A priority patent/AU2012240539B2/en
Priority to CN201911127688.9A priority patent/CN110942370B/en
Publication of WO2012138452A1 publication Critical patent/WO2012138452A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0621 - Item configuration or customization
    • G06Q30/0641 - Shopping interfaces
    • G06Q30/0643 - Graphical representation of items or shoppers
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • the subject matter disclosed herein generally relates to the processing of data. Specifically, the present disclosure addresses systems and methods of generating an item model based on a descriptor and images.
  • a product may be manufactured by a manufacturer and available for purchase from a seller.
  • the product may take the form of a good, such as a physical item that has a three-dimensional (3D) shape.
  • a product may be a particular model of digital camera or a specific model of a car.
  • the seller may be the same as the manufacturer, or the seller may be distinct from the manufacturer.
  • An item may be a specimen (e.g., an individual instance) of the product, and multiple items may constitute multiple specimens of the product. Accordingly, a seller may seek to merchandise one or more items as specimens of the product.
  • the seller may use a network-based system to present information referencing the item to a user of the network-based system (e.g., a potential buyer of the item).
  • examples of network-based systems include commerce systems (e.g., shopping websites), publication systems (e.g., classified advertisement websites), listing systems (e.g., auction websites), and transaction systems (e.g., payment websites).
  • examples of information referencing the item include a product information document, a product review, a comment concerning the item, a view item page, a search result, an advertisement, a recommendation, a suggestion, an auction listing, a wish list, or any suitable combination thereof.
  • FIG. 1 is a conceptual diagram illustrating generation of an item model based on images of the item and on a product model, according to some example embodiments.
  • FIG. 2 is a storyboard diagram illustrating a document, with an image of the item, being superseded by a document with a model viewer showing a 3D model of the item, according to some example embodiments.
  • FIG. 3 is a face view of a user interface of a user application with a model viewer showing a 3D model of the item, according to some example embodiments.
  • FIG. 4 is a face view of a user interface of a seller application configured to facilitate generation of an item model based on a descriptor and images, according to some example embodiments.
  • FIG. 5 is a network diagram illustrating a network environment suitable for generating an item model based on a descriptor and images, according to some example embodiments.
  • FIG. 6 is a block diagram illustrating components of a model generation machine, according to some example embodiments.
  • FIG. 7 is a block diagram illustrating components of a generation module within a model generation machine, according to some example embodiments.
  • FIGS. 8-10 are flowcharts illustrating operations in a method of generating an item model based on a descriptor and images, according to some example embodiments.
  • FIG. 11 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
  • Example methods and systems are directed to generating an item model based on a descriptor and images. Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
  • a model generation machine may form all or part of a network-based system.
  • the model generation machine may generate an item model (e.g., a 3D model of the item) based on a set of images of an item and based on a product model (e.g., a 3D model of a product of which the item is a specimen).
  • the model generation machine may use the set of images to convert a model of a product to a model of an item.
  • the item may be available for purchase from a seller.
  • the model generation machine may access the set of images, as well as a descriptor of the item.
  • a "descriptor" of an item refers to textual information (e.g., one or more alphanumeric characters) that describes the item.
  • a descriptor of an item may include one or more textual tokens (e.g., one or more words, phrases, strings, or numbers).
  • the model generation machine may identify the product model. Accordingly, the model generation machine may generate the item model from the identified product model and the accessed set of images.
  • the model generation machine receives a 3D model of the product from a manufacturer of the product and stores the 3D model in a product database for access when identifying the product model.
  • the model generation machine may receive a descriptor of the product from the manufacturer of the product and store the descriptor of the product in the product database for access when identifying the product model.
  • the descriptor of the product corresponds to the descriptor of the item and may be stored in the product database as corresponding to the descriptor of the item.
  • the product database may store a descriptor of the product with a reference (e.g., a pointer or an address) to the descriptor of the item.
  • the descriptor of the item may include some or all of the descriptor of the product. Moreover, the descriptor of the item may include an abbreviation, a variation, a nickname, a misspelling, or any suitable combination thereof, of the descriptor of the product. In some example embodiments, the descriptor of the item includes a code that specifies the descriptor of the product (e.g., a color number, a marketing code, or an inventory number). As further examples, the descriptor of the item may include a manufacturer name (e.g., of the product), a model name (e.g., of the product), a model year (e.g., of the product), or any suitable combination thereof.
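The descriptor-matching behavior described above (exact lookup, plus tolerance for abbreviations and misspellings) can be sketched as follows. This is an illustrative stand-in, not the disclosed implementation; the database contents, function names, and the use of fuzzy string matching are all assumptions.

```python
import difflib

# Hypothetical in-memory stand-in for the product database of descriptors;
# keys and values are illustrative, not from the disclosure.
PRODUCT_MODELS = {
    "volkswagen beetle": "beetle-3d-model",
    "volkswagen golf": "golf-3d-model",
}

def normalize(descriptor):
    """Lowercase and collapse whitespace so trivial variations still match."""
    return " ".join(descriptor.lower().split())

def identify_product_model(item_descriptor, cutoff=0.6):
    """Resolve an item descriptor to a product model, tolerating
    misspellings and variations via fuzzy string matching."""
    key = normalize(item_descriptor)
    if key in PRODUCT_MODELS:
        return PRODUCT_MODELS[key]
    # Fall back to the closest stored descriptor, if any is close enough.
    close = difflib.get_close_matches(key, PRODUCT_MODELS, n=1, cutoff=cutoff)
    return PRODUCT_MODELS[close[0]] if close else None
```

A misspelled descriptor such as "Volkswagon Beetle" still resolves to the stored "volkswagen beetle" entry, while an unrelated descriptor returns no match.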
  • FIG. 1 is a conceptual diagram illustrating generation of an item model 130 based on a set of images 110 of the item and on a product model 120, according to some example embodiments.
  • the set of images 110 may include one or more images, which may be two-dimensional (2D) graphical images of the item.
  • the set of images 110 includes an image 111 of the item (e.g., a car), where the image 111 is a left side view of the item.
  • the set of images 110 may be a group of photographs of the item taken from various directions relative to the item (e.g., from multiple angles).
  • the set of images 110 is specific to the item, and as shown, the image 111 of the item may depict one or more characteristics (e.g., defects, customizations, or anomalies) that are unique to the item (e.g., dents or scratches on the driver's door of the car).
  • the product model 120 is a 3D model of the product of which the item is a specimen.
  • the product may have a 3D shape common to multiple specimens of the product (e.g., common to multiple items), and the product model 120 may include data that is representative of the 3D shape.
  • the product model may include geometric data (e.g., in the form of a set of points in a 3D coordinate space) that define the 3D shape of the product.
  • Such geometric data may be presentable in the form of a set of points, a wireframe model, a polygon model, a texture mapped model, or any suitable combination thereof.
  • the product model 120 is a 3D model of a car, and the car is being presented as a wireframe model.
  • the item model 130 is generated from the set of images 110 and the product model 120. Generation of the item model 130 may be performed by one or more components of a model generation machine. As shown, the item model 130 has the 3D shape of the product model 120, as well as characteristics (e.g., dents or scratches) unique to the item, as depicted in the image 111 of the item (e.g., the car). Accordingly, the item model 130 is a 3D model of the item. In other words, the item model 130 is a 3D model of a particular specimen of the product having the 3D shape that is represented in the product model 120.
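The geometric data a product model might carry (a set of points in a 3D coordinate space, wireframe edges, polygon faces, texture references) can be sketched as a minimal data structure. This is a toy illustration under assumed names, not the disclosure's actual data format.

```python
from dataclasses import dataclass, field

# A minimal sketch of the geometric data a product model might carry:
# vertices in 3D space, wireframe edges, polygon faces, texture references.
@dataclass
class ProductModel:
    vertices: list                                # [(x, y, z), ...]
    edges: list = field(default_factory=list)     # index pairs into vertices
    faces: list = field(default_factory=list)     # index triples (triangles)
    textures: dict = field(default_factory=dict)  # face index -> image ref

# A unit cube as a toy wireframe: 8 corners, with an edge between any two
# corners that differ in exactly one coordinate (12 edges in total).
corners = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
cube = ProductModel(
    vertices=corners,
    edges=[(i, j) for i in range(8) for j in range(i + 1, 8)
           if sum(a != b for a, b in zip(corners[i], corners[j])) == 1],
)
```

The same structure supports the presentations named above: the vertex set alone is a point cloud, vertices plus edges a wireframe, faces a polygon model, and faces plus texture references a texture-mapped model.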
  • FIG. 2 is a storyboard diagram illustrating a document 210, with an image 212 of the item, being superseded by a document 220 with a model viewer 230 showing a 3D model of the item, according to some example embodiments.
  • the documents 210 and 220 may be presented (e.g., sequentially) within a user interface (e.g., a graphical window, a web browser, a document viewer, or a mobile application).
  • the documents 210 and 220 may constitute all or part of a web page.
  • the document 210 is presented first.
  • the document 210 includes the image 212 of the item (e.g., the car), a description 214 of the item, and a control interface 216.
  • the image 212 of the item is a 2D view of the item (e.g., a left side view).
  • the description 214 may include one or more descriptors of the item (e.g., "2016,” “Volkswagen,” “Beetle,” “red,” “leopard interior”).
  • the control interface 216 is operable (e.g., by a user) to initiate presentation of the document 220 (e.g., as a replacement for the document 210).
  • control interface 216 may be a submission control that is operable to submit a request for more information regarding the item (e.g., the car).
  • the request may be a request for the document 220 or for presentation thereof.
  • the control interface 216 is a hyperlink that may be clicked to present the document 220, and the control interface 216 includes text instructions describing operation of the hyperlink (e.g., "3D model available! Click here to view!").
  • the document 220 includes the model viewer 230, which shows a 3D model of the item (e.g., the item model 130).
  • the model viewer 230 may include one or more controls to adjust the presentation of the 3D model of the item.
  • the model viewer 230 may include all or part of a user interface configured to present the 3D model of the item in any of a number of views.
  • the model viewer 230 includes three cursor controls, labeled "rotate,” "zoom,” and "pan.”
  • the model viewer 230 may be configured to perform a rotation of the item model 130, a zoom of the item model 130, a pan of the item model 130, or any suitable combination thereof.
  • the model viewer 230 is present in the document 220 and absent from the document 210.
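The three viewer operations named above (rotate, zoom, pan) can be sketched as coordinate transforms over a vertex list. The function names and the choice of the y axis for rotation are illustrative assumptions, not the viewer's actual implementation.

```python
import math

# Sketch of the three viewer operations, applied to a list of (x, y, z)
# vertices like the product model's geometric data.
def rotate_y(vertices, angle):
    """Rotate points about the vertical (y) axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in vertices]

def zoom(vertices, factor):
    """Scale points toward or away from the origin (factor > 1 enlarges)."""
    return [(x * factor, y * factor, z * factor) for x, y, z in vertices]

def pan(vertices, dx, dy):
    """Translate points parallel to the view plane."""
    return [(x + dx, y + dy, z) for x, y, z in vertices]
```

A viewer's cursor controls would map drag and scroll input to these transforms before projecting the vertices to the screen.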
  • FIG. 3 is a face view of a user interface 310 of a user application with the model viewer 230 showing a 3D model of the item (e.g., item model 130), according to some example embodiments.
  • the user application may form all or part of user software (e.g., a computer program, a mobile application, an applet, or an app) operable by a user of a model generation machine, a user of a network-based system, or a user of both.
  • the user interface 310 includes a "contact seller" button 312 and a "more info” button 314, in addition to the model viewer 230.
  • the user interface 310 may include one or more descriptors of the item (e.g., "2016,” “Volkswagen,” or “Beetle”).
  • the "contact seller” button 312 is operable (e.g., by the user) to initiate a communication with a seller of the item (e.g., a seller of the car).
  • the "contact seller" button 312 may launch an email editor, an instant messaging window, a chat client, a text message interface, or any suitable combination thereof.
  • operation of the "contact seller" button 312 initiates a communication that is pre-addressed to the seller (e.g., by mail address, email address, username, identifier, or phone number).
  • the "more info” button 314 is operable (e.g., by the user) to initiate presentation of further information regarding the item shown in the model viewer 230 (e.g., information that references the item).
  • the "more info” button 314 may be a hyperlink that is operable to present a product information document that provides detailed specifications for the item.
  • operation of the "more info" button 314 may present a view item page maintained by the seller of the item and providing merchandising information about the item.
  • the model viewer 230 may be configured to present the 3D model of the item in any number of views. As such, the model viewer 230 may be configured to respond to one or more cursor inputs (e.g., touchscreen inputs) by manipulating the 3D image of the item (e.g., item model 130) within the model viewer 230.
  • FIG. 4 is a face view of a user interface 410 of a seller application configured to facilitate generation of the item model 130 based on a descriptor and the set of images 110, according to some example embodiments.
  • the seller application may form all or part of the seller software (e.g., a computer program, a mobile application, an applet, or an app) operable by a seller of an item using a seller device (e.g., a camera-enabled mobile phone) to communicate with a model generation machine, with a network-based system, or with both.
  • the user interface 410 includes an image viewer 420, a "take photo” button 422, a “save photo to set” button 424, an "upload photo set” button 426, a description entry field 430, and an "upload description” button 432.
  • the seller application may be executable by a seller device that includes a camera, and the seller application may be configured to generate the set of images 110 using the camera of the seller device.
  • the image viewer 420 displays an image of the item (e.g., image 111) as captured by the seller device (e.g., by a camera within or connected to the seller device).
  • the image of the item may be stored temporarily or indefinitely on the seller device (e.g., in a memory card, a cache, or a flash drive). Accordingly, the image viewer 420 may display a saved image or an unsaved image. As shown, the image viewer 420 displays a live image from a camera of the seller device.
  • the "take photo" button 422 is operable (e.g., by the seller) to save the image shown in the image viewer 420 on the seller device. This may have the effect of mimicking the operation of a camera shutter in taking a photograph. Consequently, one or more activations of the "take photo" button 422 may generate one or more images included in the set of images 110 of the item.
  • the "save photo to set" button 424 is operable (e.g., by the seller) to save the image displayed in the image viewer 420 to a set of images (e.g., save the image 111 to the set of images 110).
  • the set of images is stored by the seller device (e.g., in a persistent storage location), and operation of the "save photo to set" button 424 initiates storage of the displayed image (e.g., image 111) among the set of images (e.g., set of images 110).
  • the "upload photo set" button 426 is operable (e.g., by the seller) to enable access to the set of images (e.g., set of images 110) by a model generation machine, by a network-based system, or by any suitable combination thereof. Enabling access to the set of images may include transmitting the set of images (e.g., to the model generation machine) or transmitting an authorization to access the set of images.
  • the model generation machine may access (e.g., read) the set of images 110 in response to reception of an authorization to access the set of images 110, where the authorization was initiated by activation of the "upload photo set" button 426. As another example, the model generation machine may access (e.g., receive) the set of images 110 in response to a transmission of the set of images 110, where the transmission was initiated by activation of the "upload photo set" button 426.
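The capture, save-to-set, and upload flow driven by the buttons above can be sketched as a small seller-side class. The class and method names are illustrative assumptions, and the transport is abstracted behind a caller-supplied callable rather than any specific network API.

```python
# A sketch of the seller-side flow: take photo, save photo to set,
# upload photo set. Names are illustrative, not from the disclosure.
class PhotoSetUploader:
    def __init__(self):
        self.live_frame = None   # image currently shown in the image viewer
        self.photo_set = []      # accumulated set of images of the item

    def take_photo(self, frame):
        """Mimic the camera shutter: capture the live viewer frame."""
        self.live_frame = frame

    def save_photo_to_set(self):
        """Add the captured frame to the item's image set."""
        if self.live_frame is not None:
            self.photo_set.append(self.live_frame)

    def upload_photo_set(self, send):
        """Enable access to the set, here by transmitting a copy of it
        via a caller-supplied `send` callable (e.g., an HTTP wrapper)."""
        return send(list(self.photo_set))
```

Passing an authorization token instead of the images themselves would model the alternative access path described above.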
  • the description entry field 430 is operable (e.g., by the seller) to enter one or more descriptors pertinent to the item depicted in the set of images (e.g., set of images 110), including the image displayed in the image viewer 420 (e.g., image 111).
  • the description entry field 430 may accept text in the form of alphanumeric characters, including numbers, letters, words, phrases, codes, or any suitable combination thereof.
  • the description entry field includes multiple descriptors (e.g., "2016,” “Volkswagen,” “Beetle,” “red exterior,” and “leopard”).
  • the "upload description" button 432 is operable (e.g., by the seller) to enable access to the one or more descriptors by a model generation machine, by a network-based system, or by any suitable combination thereof. Enabling access to the one or more descriptors may include transmitting the one or more descriptors (e.g., to the model generation machine) or transmitting an authorization to access the one or more descriptors.
  • the model generation machine may access (e.g., read) the one or more descriptors in response to reception of an authorization to access the one or more descriptors, where the authorization was initiated by activation of the "upload description" button 432.
  • the model generation machine may access (e.g., receive) the one or more descriptors in response to a transmission of the one or more descriptors, where the transmission was initiated by activation of the "upload description" button 432.
  • FIG. 5 is a network diagram illustrating a network environment 500 suitable for generating the item model 130 based on a descriptor (e.g., "2016 Volkswagen Beetle") and the set of images 110, according to some example embodiments.
  • the network environment 500 includes a model generation machine 510, a product database 512, an item database 514, a user device 530, and the seller device 550, all communicatively coupled to each other via a network 590.
  • the model generation machine 510, the product database 512, and the item database 514 may form all or part of a network-based commerce system 505.
  • the model generation machine 510 may be implemented in a computer system, as described below with respect to FIG. 11.
  • each of the user 532 and the seller 552 may be a human user (e.g., a human being), a machine user (e.g., a software program configured to interact with the user device 530), or any suitable combination thereof (e.g., a human assisted by a machine).
  • the user 532 is not part of the network environment 500, but is associated with the user device 530 and may be a user of the user device 530.
  • the user device 530 may be a deskside computer, a tablet computer, or a smart phone belonging to the user 532.
  • the seller 552 is not part of the network environment 500, but is associated with the seller device 550.
  • the seller device 550 may be a tablet computer belonging to the seller 552.
  • the seller device 550 includes a camera or is otherwise capable of generating one or more images (e.g., image 1 11) of the item.
  • any of the machines, databases, or devices shown in FIG. 5 may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine.
  • a computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 11.
  • a "database" is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database, a triple store, or any suitable combination thereof.
  • any two or more of the machines illustrated in FIG. 5 may be combined into a single machine, and the functions described herein for any single machine may be subdivided among multiple machines.
  • the network 590 may be any network that enables communication between machines (e.g., model generation machine 510). Accordingly, the network 590 may be a wired network, a wireless network, or any suitable combination thereof. The network 590 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
  • FIG. 6 is a block diagram illustrating components of a model generation machine 510, according to some example embodiments.
  • the model generation machine 510 includes an access module 610, an identification module 620, a generation module 630, a communication module 640, and a storage module 650, all configured to communicate with each other (e.g., via a bus, a shared memory, or a switch). Any one or more of these modules may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules.
  • the access module 610 is configured to access the set of images 110 and a descriptor (e.g., "Volkswagen Beetle") of an item.
  • the set of images 110 and the descriptor may be provided from the seller device 550.
  • the access module 610 may access the set of images 110, the descriptor, or both, by accessing the item database 514, the seller device 550, or any suitable combination thereof.
  • the identification module 620 is configured to identify the product model 120 based on the descriptor of the item.
  • product model 120 is an example of a 3D model of the product of which the item is a specimen.
  • the identification module 620, in identifying the product model 120, may access the product database 512 to access the product model 120, a descriptor (e.g., "Beetle") of the product, or any suitable combination thereof.
  • the generation module 630 is configured to generate the item model 130 based on the product model 120 and based on the set of images 1 10.
  • the item model 130 is an example of a 3D model of the item, which may be available for purchase from the seller 552.
  • the generation module 630 may be configured to perform edge detection, image segmentation, background removal, or any suitable combination thereof, upon one or more images (e.g., image 111) from the set of images 110. For example, the generation module 630 may detect an edge of the item depicted in the image 111. As another example, the generation module 630 may segment the image 111 into a foreground portion and a background portion, where the foreground portion depicts the item (e.g., the car) and the item is absent from the background portion. As a further example, the generation module 630 may remove the background portion of the image from the image (e.g., after segmentation of the image). In one or more of these examples, the generation module 630 may utilize known techniques for segmentation of images.
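The segmentation and background-removal steps described above can be sketched in toy form with a simple intensity threshold. Real systems would use proper edge-detection and segmentation algorithms; the threshold approach, the row-of-pixels image representation, and the function names here are all illustrative assumptions.

```python
# Toy threshold-based segmentation and background removal, on a grayscale
# image represented as a list of pixel rows (values 0-255).
def segment(image, threshold=128):
    """Return a mask marking pixels darker than `threshold` as foreground."""
    return [[pixel < threshold for pixel in row] for row in image]

def remove_background(image, mask, fill=255):
    """Keep foreground pixels; replace background pixels with `fill`."""
    return [[p if keep else fill for p, keep in zip(row, mask_row)]
            for row, mask_row in zip(image, mask)]
```

Applying `segment` and then `remove_background` leaves only the pixels that depict the item, which is the precondition for the texture-mapping step that follows.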
  • the generation module 630, in generating the item model 130, identifies an unusable image within the set of images 110 and removes the unusable image from the set of images 110. For example, after one or more of edge detection, image segmentation, or background removal, the generation module 630 may determine that an image depicts an incorrect item (e.g., different from the item depicted in the remainder of the set of images 110), a prohibited item (e.g., an item unsupported by the network-based commerce system 505), or no item at all.
  • the generation module 630 may determine that an image is unsuitable for use in generating the item model 130 due to, for instance, insufficient resolution, low brightness, poor contrast, lack of clarity (e.g., blur), or any suitable combination thereof.
  • the generation module 630 may determine that an image includes prohibited content (e.g., vulgar or obscene text or graphics). Accordingly, the generation module may identify such an image as an unusable image.
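The resolution, brightness, and contrast checks above can be sketched as a simple quality gate over grayscale pixel statistics. The thresholds, function name, and image representation are illustrative assumptions; a production system would use more robust metrics (e.g., a blur measure).

```python
# A sketch of the image quality gate: reject images that are too small,
# too dark, or too flat to be useful for model generation.
def is_usable(image, min_pixels=16, min_brightness=20, min_contrast=10):
    pixels = [p for row in image for p in row]   # flatten grayscale rows
    if len(pixels) < min_pixels:
        return False                             # insufficient resolution
    if sum(pixels) / len(pixels) < min_brightness:
        return False                             # too dark (low brightness)
    if max(pixels) - min(pixels) < min_contrast:
        return False                             # too flat (poor contrast)
    return True
```

Images failing any check would be flagged as unusable and removed from the set before texture mapping.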
  • the generation module 630, in generating the item model 130, identifies a portion of the product model 120 that is to be texture mapped with an image (e.g., image 111) from the set of images 110. Similarly, in generating the item model 130, the generation module 630 may identify multiple images (e.g., two or more images) from the set of images 110 that intersect in an overlapping region of the product model 120, when the multiple images are texture mapped onto the product model 120. The identification of the portion, the multiple images, or any combination thereof, may be based on an analysis (e.g., comparison) of the foreground of the image with the product model 120.
  • the generation module 630 texture maps at least some of the set of images 110 onto the product model 120, in generating the item model 130. Accordingly, the generation module 630 may include a texture mapping engine. In alternative example embodiments, the texture mapping is performed by a separate texture mapping engine (e.g., within a graphics processor) within the model generation machine or within the user device 530.
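The core of texture mapping, sampling an image at (u, v) coordinates attached to a model face, can be sketched minimally as follows. A full texture mapper interpolates across the whole face; this illustrative function (an assumption, not the disclosed engine) samples only at the given coordinates.

```python
# Minimal texture sampling sketch: map each (u, v) coordinate in [0, 1)
# on a model face to a pixel of a 2D image (list of pixel rows).
def sample_texture(face_uv, image):
    height, width = len(image), len(image[0])
    samples = []
    for u, v in face_uv:
        # Clamp to the last pixel so u == 1.0 or v == 1.0 stays in bounds.
        col = min(int(u * width), width - 1)
        row = min(int(v * height), height - 1)
        samples.append(image[row][col])
    return samples
```

In a real engine this sampling runs per rendered fragment, with the (u, v) coordinates interpolated from the face's vertices.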
  • the generation module 630 is configured to generate the model viewer 230 (e.g., for inclusion in the document 220 or in the user interface 310).
  • the model viewer 230 may be configured to perform a rotation, a zoom, or a pan, or any suitable combination thereof, of the item model 130.
  • the communication module 640 is configured to receive the product model 120 (e.g., from a manufacturer of the product of which the item is a specimen), receive a descriptor (e.g., "Beetle") of the product (e.g., from the manufacturer of the product), or any suitable combination thereof.
  • the communication module 640 provides a user application to the user device 530.
  • the user application may include the user interface 310, which includes the model viewer 230 and is configured to present the model viewer 230 on the user device 530 (e.g., to the user 532).
  • the communication module 640 provides the document 210 to the user device 530.
  • the model viewer 230 is absent from the document 210, though the document 210 includes a descriptor (e.g., "Volkswagen Beetle") of the item, as well as the control interface 216 (e.g., a submission control). Operation of the control interface 216 may cause the communication module 640 to receive a request for the document 220 from the user device 530. In response to the receiving of this request, the communication module 640 may provide the document 220 to the user device 530. As noted above, the document 220 includes the model viewer 230.
  • the communication module 640 receives the set of images 110 and the descriptor (e.g., "Volkswagen Beetle") of the item (e.g., from the seller device 550).
  • the communication module 640 may receive the set of images 110 as a result of operation of the "upload photo set" button 426, and the communication module 640 may receive the descriptor of the item as a result of operation of the "upload description" button 432.
  • the descriptor of the item and the set of images 110 may be received by the communication module 640 as a submission by the seller 552 of the item.
  • the communication module 640 provides a seller application to the seller device 550.
  • the seller application may include the user interface 410, which may be configured to communicate the set of images 110, the descriptor (e.g., "Volkswagen Beetle"), or both, to the model generation machine 510.
  • the storage module 650 is configured to store the product model 120, the descriptor (e.g., "Beetle") of the product, or both, in the product database 512 (e.g., for access by the identification module 620).
  • the storage module 650 stores the item model 130 in the item database 514 (e.g., for access by the model generation machine 510, the network-based commerce system 505, the user device 530, the seller device 550, or any suitable combination thereof).
  • the storage module 650 may also store one or more images (e.g., image 111) in the item database 514, for access by the access module 610.
  • the storage module 650 may store a descriptor (e.g., one or more descriptors uploaded by the seller 552 using the description entry field 430 of the user interface 410) in the item database 514, as corresponding to the item, for access by the access module 610.
  • FIG. 7 is a block diagram illustrating modules 710-790 within the generation module 630 of the model generation machine 510, according to some example embodiments.
  • the generation module 630 includes a usability module 710, an edge detection module 720, an image segmentation module 730, a background removal module 740, an overlap identification module 750, a texture mapping module 760, a model viewer module 770, an application module 780, and a web page module 790, all configured to communicate with each other within the generation module 630.
  • the modules 710-790 may each implement one or more of the functions described above with respect to the generation module 630.
  • the usability module 710 may be configured to identify an unusable image within a set of images 110, remove the unusable image from the set of images 110, or both.
  • identification of the unusable image may include determining that an image (e.g., image 111) depicts an incorrect item, a prohibited item, or no item at all. This identification may include determining that the image is of poor quality (e.g., insufficient resolution, low brightness, poor contrast, or excessive blur) or that the image includes prohibited content.
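The quality criteria above can be approximated with simple pixel statistics. The following sketch is illustrative only; the function name, thresholds, and flat grayscale-pixel representation are assumptions, not part of the disclosure, and a production usability module would also screen for incorrect, prohibited, or absent items:

```python
def is_usable(width, height, pixels,
              min_pixels=640 * 480, min_brightness=40, min_contrast=25):
    """Heuristic usability check for one candidate image.

    `pixels` is a flat list of grayscale values (0-255). All thresholds
    are illustrative assumptions chosen for this sketch.
    """
    if width * height < min_pixels:
        return False  # insufficient resolution
    mean = sum(pixels) / len(pixels)
    if mean < min_brightness:
        return False  # too dark
    # Use the standard deviation of intensity as a crude contrast measure.
    variance = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    if variance ** 0.5 < min_contrast:
        return False  # too flat (low contrast)
    return True
```

An image failing any check would then be removed from the set of images 110 before model generation.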
  • the edge detection module 720 may detect an edge of the item (e.g., the car) depicted in one or more images (e.g., image 111) within the set of images 110.
  • the image segmentation module 730 may be configured to segment an image into a foreground portion and a background portion, and the background removal module 740 may be configured to remove the background portion of the image.
  • the overlap identification module 750 identifies two or more images (e.g., image 111) that overlap each other when texture mapped onto the product model 120, thus intersecting in an overlapping region of the product model 120.
  • a texture mapping module 760 is configured to perform texture mapping of some or all of the set of images 110 onto the product model 120.
  • the model viewer module 770 is configured to generate the model viewer 230.
  • generation of the model viewer 230 includes generating a widget or pop-up window configured to present (e.g., display, manipulate, or both) the item model 130.
  • the application module 780 is configured to generate a user application (e.g., for provision by the communication module 640 to the user device 530). Accordingly, the application module 780 may generate the user interface 310.
  • the web page module 790 is configured to generate the document 210, the document 220, or both (e.g., for provision by the communication module 640 to the seller device 550). As noted above, one or both of the documents 210 and 220 may be generated as web pages.
  • FIG. 8-10 are flowcharts illustrating operations in a method 800 of generating the item model 130 based on a descriptor (e.g., "Volkswagen Beetle") and the set of images 110, according to some example embodiments. Operations of the method 800 may be performed by the model generation machine 510, using modules described above with respect to FIG. 6-7.
  • some example embodiments of the method 800 include operations 810, 820, and 830.
  • the access module 610 of the model generation machine 510 accesses the set of images 110 and a descriptor (e.g., "Volkswagen Beetle") of the item depicted in the set of images 110.
  • the access module 610 may access the set of images 110, the descriptor, or both, by accessing the item database 514, the seller device 550, or any suitable combination thereof.
  • the identification module 620 of the model generation machine 510 identifies the product model 120 based on the descriptor of the item. For example, the identification module 620 may access the descriptor of the item (e.g., stored in the item database 514), access a descriptor of the product (e.g., stored in the product database 512), and perform a comparison of the two descriptors. Based on the comparison, the identification module 620 may determine that the item is a specimen of the product and identify the product model 120 as corresponding to the item.
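The descriptor comparison described above might be sketched as a token-overlap match between the item's descriptor and the descriptors stored in the product database. This is an illustrative simplification; the function names and the dictionary-based catalog structure are assumptions introduced for this sketch, not part of the disclosure:

```python
def normalize(descriptor):
    """Lowercase and tokenize a descriptor such as 'Volkswagen Beetle'."""
    return set(descriptor.lower().split())

def identify_product(item_descriptor, product_catalog):
    """Return the product model whose descriptor best overlaps the item's.

    `product_catalog` maps a product descriptor to its 3D model; this is
    a stand-in for the product database 512.
    """
    item_tokens = normalize(item_descriptor)
    best, best_score = None, 0
    for product_descriptor, model in product_catalog.items():
        score = len(item_tokens & normalize(product_descriptor))
        if score > best_score:
            best, best_score = model, score
    return best
```

A richer implementation could also match abbreviations, nicknames, and misspellings of the product descriptor, as noted elsewhere in this description.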
  • the generation module 630 of the model generation machine 510 generates the item model 130 based on the product model 120 (e.g., as identified in operation 820) and based on the set of images 110 (e.g., as accessed in operation 810). Further details of operation 830, according to some example embodiments, are discussed below with respect to FIG. 10. [0066] As shown in FIG. 9, some example embodiments of the method 800 include one or more of operations 910-984.
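Taken together, operations 810, 820, and 830 amount to a three-step pipeline: access the images and descriptor, identify the product model, and generate the item model. The database interfaces and dictionary-based model structure in the sketch below are assumptions introduced for illustration; the description does not prescribe concrete types:

```python
def generate_item_model(item_id, item_db, product_db):
    """Sketch of method 800: access (810), identify (820), generate (830)."""
    # Operation 810: access the set of images and the descriptor of the item.
    images = item_db.get_images(item_id)
    descriptor = item_db.get_descriptor(item_id)
    # Operation 820: identify the product model based on the descriptor.
    product_model = product_db.find_model(descriptor)
    # Operation 830: generate the item model from the product model and
    # the usable images (unusable images having been filtered out).
    usable = [img for img in images if img.get("usable", True)]
    return {"base": product_model, "textures": usable}
```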
  • the communication module 640 of the model generation machine 510 receives the product model 120 from a manufacturer of the product (e.g., from a server machine maintained by the manufacturer).
  • the storage module 650 of the model generation machine 510 stores the product model 120 in the product database 512 (e.g., for access in operation 810).
  • the communication module 640 of the model generation machine 510 receives a descriptor of the product from the manufacturer of the product.
  • the storage module 650 of the model generation machine 510 stores the descriptor of the product in the product database 512 (e.g., for access in operation 820).
  • Operation 930 may be executed at any point prior to performance of operation 810.
  • the communication module 640 of the model generation machine 510 provides a seller application to the seller device 550.
  • the seller application may be generated by the application module 780 of the model generation machine 510 prior to performance of operation 930.
  • the seller application may be configured to communicate the set of images 110, the descriptor of the item, or both, from the seller device 550 to the network-based commerce system 505 (e.g., to the model generation machine 510).
  • operation 940 may be executed at any point prior to performance of operation 810.
  • the communication module 640 provides a user application to the user device 530.
  • the user application may be generated by the application module 780 prior to performance of operation 940.
  • the user application may be configured to present the model viewer 230 on the user device 530.
  • Operation 950 may be performed as part of operation 820, performed in parallel (e.g., contemporaneously) with operation 820, performed in response to operation 820, or any suitable combination thereof.
  • the access module 610 of the model generation machine 510 accesses the product model 120 (e.g., by accessing the product database 512). Accordingly, the access module 610 may provide the product model 120 to the generation module 630 (e.g., for use in operation 830).
  • the storage module 650 of the model generation machine 510 stores the item model 130 in the item database 514. This may have the effect of preserving the item model 130 for use in generating the model viewer 230, as described immediately below with respect to operation 970.
  • the generation module 630 of the model generation machine 510 generates the model viewer 230.
  • the model viewer 230 may be generated as a generic model viewer without the item model 130 or generated as a specific model viewer based on (e.g., including) the item model 130. Accordingly, generation of the model viewer 230 may include accessing the item model 130 (e.g., by accessing the item database 514).
  • the communication module 640 of the model generation machine 510 provides the model viewer 230 to the user device 530 (e.g., to a user application executing on the user device 530). For example, the user application may display the user interface 310 on the user device 530, and the communication module 640 may provide the model viewer 230 for inclusion in the user interface 310.
  • the communication module 640 provides the item model 130 to the user device 530 (e.g., to the user application executing on the user device 530).
  • the model viewer 230 includes the item model 130, and these operations 972 and 974 may be performed as a single operation.
  • the communication module 640 of the model generation machine 510 provides the document 210 (e.g., a web page without the model viewer 230) to the user device 530 (e.g., to a browser executing on the user device 530).
  • the document 210 may include a control interface 216 that is operable to submit a request for information regarding the item. Supposing that the control interface 216 is operated, in operation 982, the communication module 640 receives the request for information regarding the item (e.g., as communicated from the user device 530).
  • the communication module 640 provides the document 220 (e.g., a web page with the model viewer 230) to the user device 530.
  • the item model 130 is included in the model viewer 230
  • the item model 130 is accordingly provided along with the model viewer 230
  • a further operation may be performed by the communication module 640 to provide the item model 130 to the user device 530 for inclusion in the model viewer 230.
  • some example embodiments of the method 800 include one or more of operations 1010-1070.
  • the communication module 640 of the model generation machine 510 receives one or more images (e.g., image 111) from the seller device 550.
  • the one or more images may constitute all or part of the set of images 110.
  • operation 1010 may be performed in response to operation of the "upload photo set" button 426 in the user interface 410 of a seller application executing on the seller device 550.
  • the storage module 650 of the model generation machine 510 may store the one or more images in the item database 514 (e.g., for access in operation 810).
  • the communication module 640 receives one or more descriptors of the item from the seller device 550.
  • the one or more descriptors may constitute all or part of a description of the item.
  • operation 1020 may be performed in response to operation of the "upload description" button 432 in the user interface 410 of the seller application executing on the seller device 550.
  • the storage module 650 of the model generation machine 510 may store the one or more descriptors in the item database 514 (e.g., for access in operation 810).
  • One or more of operations 1030-1070 may be included in operation 830, which may be performed by the generation module 630 of the model generation machine 510, as noted above. According to various example embodiments, one or more of the modules described above with respect to FIG. 7 are used to perform one or more of operations 1030-1070.
  • the usability module 710 identifies an unusable image (e.g., image 111) among the set of images 110. In response to identification of the unusable image, in operation 1032, the usability module 710 may remove the unusable image from the set of images 110.
  • the edge detection module 720 detects at least one edge within an image (e.g., image 111) among the set of images 110. For example, the edge detection module 720 may detect an edge of the item as depicted in the image.
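The edge detection step above could be sketched with a simple finite-difference gradient; a production edge detection module would more likely apply an operator such as Sobel or Canny. The function name and threshold below are assumptions introduced for illustration:

```python
def detect_edges(pixels, width, height, threshold=50):
    """Mark pixels where the horizontal or vertical intensity jump is large.

    `pixels` is a flat list of grayscale values (0-255); the returned
    list holds True at edge positions. A minimal illustrative sketch.
    """
    edges = [False] * (width * height)
    for y in range(height):
        for x in range(width):
            i = y * width + x
            # Forward differences; zero at the right/bottom borders.
            gx = pixels[i + 1] - pixels[i] if x + 1 < width else 0
            gy = pixels[i + width] - pixels[i] if y + 1 < height else 0
            edges[i] = abs(gx) > threshold or abs(gy) > threshold
    return edges
```

The detected edges of the item can then guide the segmentation of each image into foreground and background portions.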
  • the image segmentation module 730 segments the image (e.g., image 111) into a foreground portion and a background portion. As noted above, the foreground portion may depict the item, and the item may be absent from the background portion.
  • the background removal module 740 removes the background portion of the image (e.g., image 111) from the image (e.g., leaving only the foreground portion within the image).
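The segmentation and background-removal steps above might be illustrated with a toy threshold scheme that estimates the background level from the image border (assuming the item is roughly centered); real systems would use a richer method such as GrabCut. All names and thresholds here are assumptions, not part of the disclosure:

```python
def segment_foreground(pixels, width, height, border=1, threshold=40):
    """Split flat grayscale pixels into a foreground/background mask.

    The background level is estimated from the border pixels, and any
    pixel differing strongly from it is labeled foreground (True).
    """
    border_vals = [pixels[y * width + x]
                   for y in range(height) for x in range(width)
                   if x < border or y < border
                   or x >= width - border or y >= height - border]
    bg_level = sum(border_vals) / len(border_vals)
    return [abs(p - bg_level) > threshold for p in pixels]

def remove_background(pixels, mask, fill=0):
    """Replace background pixels with a fill value, keeping the foreground."""
    return [p if fg else fill for p, fg in zip(pixels, mask)]
```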
  • the texture mapping module 760 identifies a portion of the product model 120 to be texture mapped with an image (e.g., image 111) from the set of images 110.
  • the overlap identification module 750 identifies two or more images (e.g., image 111) from the set of images 110 that intersect in an overlapping region of the product model 120 when texture mapped onto the product model 120.
  • the texture mapping module 760 texture maps at least some of the set of images 110 onto the product model 120.
  • the texture mapping module 760 performs the texture mapping based on (e.g., taking into account) the overlapping region identified in operation 1060.
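The overlap-aware texture mapping described above might be sketched as follows, with each portion (face) of the product model receiving the average of the images whose views cover it. The face-to-image mapping and pixel-list representation are assumptions introduced for illustration, not the disclosed implementation:

```python
def texture_map(product_faces, images):
    """Assign image data to product-model faces, blending where they overlap.

    `product_faces` maps a face name to the ids of images covering that
    face; `images` maps image id to a flat list of pixel values. Where
    two or more images cover the same face (the overlapping region),
    their pixel values are averaged.
    """
    textures = {}
    for face, image_ids in product_faces.items():
        covering = [images[i] for i in image_ids if i in images]
        if not covering:
            continue  # face left untextured (no usable image covers it)
        # Average pixel-wise across all overlapping images.
        textures[face] = [sum(vals) / len(covering)
                          for vals in zip(*covering)]
    return textures
```

Averaging is only one way to reconcile the overlapping region; blending weighted by viewing angle or image quality would be a natural refinement.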
  • one or more of the methodologies described herein may facilitate communication of information about an item available for purchase from a seller.
  • one or more of the methodologies described herein may constitute all or part of a business method (e.g., a business method implemented using a machine) that provides a seller with an efficient and convenient way to create a 3D model of the item, that provides a user with an efficient and convenient way to receive 3D information about the item, or any suitable combination thereof.
  • one or more of the methodologies described herein may have the effect of facilitating a purchase of the item, increasing sales of the product of which the item is a specimen, increasing user attention (e.g., as measured in page views or click-throughs) on the product, or any suitable combination thereof.
  • one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in matching users (e.g., as potential purchasers) with products or specimens thereof that are likely to be of interest. Efforts expended by a user in identifying a product for purchase may be reduced by one or more of the methodologies described herein.
  • Computing resources used by one or more machines, databases, or devices may similarly be reduced. Examples of such computing resources include processor cycles, network traffic, memory usage, data storage capacity, power consumption, and cooling capacity.
  • FIG. 11 illustrates components of a machine 1100, according to some example embodiments, that is able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
  • FIG. 11 shows a diagrammatic representation of the machine 1100 in the example form of a computer system and within which instructions 1124 (e.g., software) for causing the machine 1100 to perform any one or more of the methodologies discussed herein may be executed.
  • the machine 1100 operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 1100 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine 1100 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1124 (sequentially or otherwise) that specify actions to be taken by that machine.
  • the term "machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 1 124 to perform any one or more of the methodologies discussed herein.
  • the machine 1100 includes a processor 1102 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1104, and a static memory 1106, which are configured to communicate with each other via a bus 1108.
  • the machine 1100 may further include a graphics display 1110 (e.g., a plasma display panel (PDP), a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)).
  • the machine 1100 may also include an alphanumeric input device 1112 (e.g., a keyboard), a cursor control device 1114 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 1116, a signal generation device 1118 (e.g., a speaker), and a network interface device 1120.
  • the storage unit 1116 includes a machine-readable medium 1122 on which is stored the instructions 1124 (e.g., software) embodying any one or more of the methodologies or functions described herein.
  • the instructions 1124 may also reside, completely or at least partially, within the main memory 1104, within the processor 1102 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 1100. Accordingly, the main memory 1104 and the processor 1102 may be considered as machine-readable media.
  • the instructions 1124 may be transmitted or received over a network 1126 (e.g., network 590) via the network interface device 1120.
  • the term "memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memosy (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1122 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions (e.g., instructions 1 124).
  • the term "machine-readable medium" shall also be taken to include any medium that is capable of storing instructions (e.g., software) for execution by the machine, such that the instructions, when executed by one or more processors of the machine (e.g., processor 1102), cause the machine to perform any one or more of the methodologies described herein.
  • the term "machine-readable medium" shall accordingly be taken to include, but not be limited to, a data repository in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
  • Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
  • a "hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
  • in various embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
  • a hardware module may be implemented mechanically, electronically, or any suitable combination thereof.
  • a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
  • a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC.
  • a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
  • a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the term "hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • "hardware-implemented module" refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output.
  • Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
  • processor-implemented module refers to a hardware module implemented using one or more processors.
  • the methods described herein may be at least partially processor- implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules.
  • the one or more processors may also operate to support performance of the relevant operations in a "cloud computing" environment or as a "software as a service” (SaaS).
  • At least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
  • the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm).
  • the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
  • such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as "data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
  • the 3D model of the product including data that is representative of the 3D shape
  • The method of description 1 further comprising:
  • the identifying of the 3D model includes accessing the 3D model of the product within the product database.
  • the receiving of the 3D model of the product is from a manufacturer of the product.
  • the identifying of the 3D model of the product includes accessing the descriptor of the product within the product database.
  • the receiving of the descriptor of the product is from a manufacturer of the product.
  • the descriptor of the item includes at least one of a manufacturer name of the product, a model name of the product, a model year of the product, the descriptor of the product, an abbreviation of the descriptor of the product, a variation of the descriptor of the product, a nickname of the descriptor of the product, a misspelling of the descriptor of the product, or a code specifying the descriptor of the product.
  • the generating of the 3D model of the item includes identifying an unusable image within a set of images; and removing the unusable image from the set of images.
  • the generating of the 3D model of the item includes at least one of detecting an edge of the item depicted in an image from the set of images, segmenting the image into a foreground portion that depicts the item and a background portion from which the item is absent, or removing the background portion from the image.
  • the generating of the 3D model of the item includes identifying a portion of the 3D model of the product to be texture mapped with an image from the set of images.
  • the generating of the 3D model of the item includes identifying two or more images from the set of images that intersect in an overlapping region when texture mapped onto the 3D model of the product.
  • the generating of the 3D model of the item includes texture mapping at least some of the set of images onto the 3D model of the product.
  • the method of description 12 further comprising:
  • a user application to a user device corresponding to a user of a network-based commerce system, the user application being configured to present the model viewer on the user device.
  • the receiving of the request is resultant from operation of the submission control.
  • the seller application being configured to communicate the set of images and the descriptor of the item from the seller device to a network-based commerce system.
  • the seller device includes a camera
  • a non- transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • a system comprising:
  • an access module configured to access a set of images of an item and a descriptor of the item, the set of images and the descriptor of the item being provided from a seller device
  • the item being a specimen of a product having a three- dimensional (3D) shape
  • an identification module configured to identify a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape;
  • a generation module configured to generate a 3D model of the item based on the identified 3D model of the product and based on the set of images, the generation module being implemented using a processor of a machine.
  • a system comprising:

Abstract

A model generation machine may form all or part of a network-based system. The model generation machine may generate an item model (e.g., a 3D model of the item) based on a set of images of an item and based on a product model (e.g., a 3D model of a product of which the item is a specimen). The item may be available for purchase from a seller. The model generation machine may access the set of images, as well as a descriptor of the item. Based on the descriptor, the model generation machine may identify the product model. Accordingly, the model generation machine may generate the item model from the identified product model and the accessed set of images.

Description

ITEM MODEL BASED ON DESCRIPTOR AND IMAGES
RELATED APPLICATION
[0001] This application claims the priority benefit of U.S. Patent Application No.
13/082,110, filed April 7, 2011, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] The subject matter disclosed herein generally relates to the processing of data. Specifically, the present disclosure addresses systems and methods of generating an item model based on a descriptor and images.
BACKGROUND
[0003] A product may be manufactured by a manufacturer and available for purchase from a seller. For example, the product may take the form of a good, such as a physical item that has a three-dimensional (3D) shape. For example, a product may be a particular model of digital camera or a specific model of a car. The seller may be the same as the manufacturer, or the seller may be distinct from the manufacturer. An item may be a specimen (e.g., an individual instance) of the product, and multiple items may constitute multiple specimens of the product. Accordingly, a seller may seek to merchandise one or more items as specimens of the product.
[0004] In merchandising an item, the seller may use a network-based system to present information referencing the item to a user of the network-based system (e.g., a potential buyer of the item). Examples of network-based systems include commerce systems (e.g., shopping websites), publication systems (e.g., classified advertisement websites), listing systems (e.g., auction websites), and transaction systems (e.g., payment websites). Examples of information referencing the item include a product information document, a product review, a comment concerning the item, a view item page, a search result, an advertisement, a recommendation, a suggestion, an auction listing, a wish list, or any suitable combination thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
[0006] FIG. 1 is a conceptual diagram illustrating generation of an item model based on images of the item and on a product model, according to some example embodiments.
[0007] FIG. 2 is a storyboard diagram illustrating a document, with an image of the item, being superseded by a document with a model viewer showing a 3D model of the item, according to some example embodiments.
[0008] FIG. 3 is a face view of a user interface of a user application with a model viewer showing a 3D model of the item, according to some example embodiments.
[0009] FIG. 4 is a face view of a user interface of a seller application configured to facilitate generation of an item model based on a descriptor and images, according to some example embodiments.
[0010] FIG. 5 is a network diagram illustrating a network environment suitable for generating an item model based on a descriptor and images, according to some example embodiments.
[0011] FIG. 6 is a block diagram illustrating components of a model generation machine, according to some example embodiments.
[0012] FIG. 7 is a block diagram illustrating components of a generation module within a model generation machine, according to some example embodiments.
[0013] FIGS. 8-10 are flowcharts illustrating operations in a method of generating an item model based on a descriptor and images, according to some example embodiments.
[0014] FIG. 11 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.

DETAILED DESCRIPTION
[0015] Example methods and systems are directed to generating an item model based on a descriptor and images. Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
[0016] A model generation machine may form all or part of a network-based system. The model generation machine may generate an item model (e.g., a 3D model of the item) based on a set of images of an item and based on a product model (e.g., a 3D model of a product of which the item is a specimen). In other words, the model generation machine may use the set of images to convert a model of a product to a model of an item. The item may be available for purchase from a seller. The model generation machine may access the set of images, as well as a descriptor of the item. As used herein, a "descriptor" of an item refers to textual information (e.g., one or more alphanumeric characters) that describes the item. A descriptor of an item may include one or more textual tokens (e.g., one or more words, phrases, strings, or numbers). Based on the descriptor of the item, the model generation machine may identify the product model. Accordingly, the model generation machine may generate the item model from the identified product model and the accessed set of images.
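As a non-limiting sketch of how a descriptor may be decomposed into textual tokens, consider the following; the function name and the tokenization rule (split on non-alphanumeric characters, lowercase) are illustrative assumptions, not part of the description above:

```python
import re

def tokenize_descriptor(descriptor):
    """Split a free-text item descriptor into lowercase textual tokens.

    Tokens may be words, numbers, or alphanumeric codes, e.g.
    "2016 Volkswagen Beetle, red" -> ["2016", "volkswagen", "beetle", "red"].
    """
    return [t.lower() for t in re.findall(r"[A-Za-z0-9]+", descriptor)]

print(tokenize_descriptor("2016 Volkswagen Beetle, red exterior"))
```

Keeping numbers as tokens matters here, since a model year such as "2016" can be just as discriminating as a model name when matching against product descriptors.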
[0017] In some example embodiments, the model generation machine receives a 3D model of the product from a manufacturer of the product and stores the 3D model in a product database for access when identifying the product model. Similarly, the model generation machine may receive a descriptor of the product from the manufacturer of the product and store the descriptor of the product in the product database for access when identifying the product model.
[0018] The descriptor of the product corresponds to the descriptor of the item and may be stored in the product database as corresponding to the descriptor of the item. For example, the product database may store a descriptor of the product with a reference (e.g., a pointer or an address) to the descriptor of the item.
[0019] The descriptor of the item may include some or all of the descriptor of the product. Moreover, the descriptor of the item may include an abbreviation, a variation, a nickname, a misspelling, or any suitable combination thereof, of the descriptor of the product. In some example embodiments, the descriptor of the item includes a code that specifies the descriptor of the product (e.g., a color number, a marketing code, or an inventory number). As further examples, the descriptor of the item may include a manufacturer name (e.g., of the product), a model name (e.g., of the product), a model year (e.g., of the product), or any suitable combination thereof.
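The handling of abbreviations, nicknames, and misspellings described above can be sketched as a normalization step over descriptor tokens. The alias table below is purely hypothetical; a real system would maintain such mappings per product catalog:

```python
# Hypothetical alias table mapping seller-entered variants to canonical
# product-descriptor tokens; entries here are illustrative only.
ALIASES = {
    "vw": "volkswagen",          # abbreviation
    "volkswagon": "volkswagen",  # common misspelling
    "bug": "beetle",             # nickname
}

def normalize_tokens(tokens):
    """Replace known abbreviations, nicknames, and misspellings
    with their canonical product-descriptor forms."""
    return [ALIASES.get(t, t) for t in tokens]

print(normalize_tokens(["2016", "vw", "bug"]))
```

After normalization, an item descriptor like "2016 VW Bug" can match the stored product descriptor "Volkswagen Beetle" by simple token comparison.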
[0020] FIG. 1 is a conceptual diagram illustrating generation of an item model 130 based on a set of images 110 of the item and on a product model 120, according to some example embodiments. The set of images 110 may include one or more images, which may be two-dimensional (2D) graphical images of the item. As shown, the set of images 110 includes an image 111 of the item (e.g., a car), where the image 111 is a left side view of the item.
Accordingly, the set of images 110 may be a group of photographs of the item taken from various directions relative to the item (e.g., from multiple angles). The set of images 110 is specific to the item, and as shown, the image 111 of the item may depict one or more characteristics (e.g., defects, customizations, or anomalies) that are unique to the item (e.g., dents or scratches on the driver's door of the car).
[0021] The product model 120 is a 3D model of the product of which the item is a specimen. In other words, the product may have a 3D shape common to multiple specimens of the product (e.g., common to multiple items), and the product model 120 may include data that is representative of the 3D shape. For example, the product model may include geometric data (e.g., in the form of a set of points in a 3D coordinate space) that define the 3D shape of the product. Such geometric data may be presentable in the form of a set of points, a wireframe model, a polygon model, a texture mapped model, or any suitable combination thereof. As shown, the product model 120 is a 3D model of a car, and the car is being presented as a wireframe model.
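A minimal data structure for such geometric data, assuming only a point set and wireframe edges, might look like the following; the class and field names are illustrative, and a polygon or texture-mapped model would carry additional face and UV data:

```python
from dataclasses import dataclass, field

@dataclass
class ProductModel:
    """Minimal 3D product model: a point cloud plus wireframe edges.

    `points` holds (x, y, z) coordinates; each entry in `edges` is a pair
    of indices into `points` describing one wireframe segment.
    """
    points: list = field(default_factory=list)
    edges: list = field(default_factory=list)

# A unit square in the z=0 plane serves as a toy "wireframe" example.
square = ProductModel(
    points=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    edges=[(0, 1), (1, 2), (2, 3), (3, 0)],
)
print(len(square.points), len(square.edges))
```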
[0022] The item model 130 is generated from the set of images 110 and the product model 120. Generation of the item model 130 may be performed by one or more components of a model generation machine. As shown, the item model 130 has the 3D shape of the product model 120, as well as characteristics (e.g., dents or scratches) unique to the item, as depicted in the image 111 of the item (e.g., the car). Accordingly, the item model 130 is a 3D model of the item. In other words, the item model 130 is a 3D model of a particular specimen of the product having the 3D shape that is represented in the product model 120.
[0023] FIG. 2 is a storyboard diagram illustrating a document 210, with an image 212 of the item, being superseded by a document 220 with a model viewer 230 showing a 3D model of the item, according to some example embodiments. In some example embodiments, the documents 210 and 220 may be presented (e.g., sequentially) within a user interface (e.g., a graphical window, a web browser, a document viewer, or a mobile application). For example, one or both of the documents 210 and 220 may constitute all or part of a web page.
[0024] As shown, the document 210 is presented first. The document 210 includes the image 212 of the item (e.g., the car), a description 214 of the item, and a control interface 216. The image 212 of the item is a 2D view of the item (e.g., a left side view). The description 214 may include one or more descriptors of the item (e.g., "2016," "Volkswagen," "Beetle," "red," "leopard interior"). The control interface 216 is operable (e.g., by a user) to initiate presentation of the document 220 (e.g., as a replacement for the document 210). Accordingly, the control interface 216 may be a submission control that is operable to submit a request for more information regarding the item (e.g., the car). For example, the request may be a request for the document 220 or for presentation thereof. As shown, the control interface 216 is a hyperlink that may be clicked to present the document 220, and the control interface 216 includes text instructions describing operation of the hyperlink (e.g., "3D model available! Click here to view!").
[0025] As indicated by a curved arrow, the document 220 is presented next. The document 220 includes the model viewer 230, which shows a 3D model of the item (e.g., the item model 130). The model viewer 230 may include one or more controls to adjust the presentation of the 3D model of the item. In other words, the model viewer 230 may include all or part of a user interface configured to present the 3D model of the item in any of a number of views. For example, as shown, the model viewer 230 includes three cursor controls, labeled "rotate," "zoom," and "pan." Accordingly, the model viewer 230 may be configured to perform a rotation of the item model 130, a zoom of the item model 130, a pan of the item model 130, or any suitable combination thereof. As shown, the model viewer 230 is present in the document 220 and absent from the document 210.
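The rotate, zoom, and pan controls described above can be sketched as simple transforms applied to the points of the 3D model; this is an illustrative geometric sketch under assumed function names, not the viewer's actual implementation:

```python
import math

def rotate_z(points, angle_deg):
    """Rotate (x, y, z) points about the z axis -- the 'rotate' control."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

def zoom(points, factor):
    """Uniformly scale points about the origin -- the 'zoom' control."""
    return [(x * factor, y * factor, z * factor) for x, y, z in points]

def pan(points, dx, dy):
    """Translate points in the view plane -- the 'pan' control."""
    return [(x + dx, y + dy, z) for x, y, z in points]

p = [(1.0, 0.0, 0.0)]
print(rotate_z(p, 90))  # the point moves onto the y axis
```

In a real viewer these transforms would typically be composed into a single view matrix and applied by the graphics hardware rather than in Python.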
[0026] FIG. 3 is a face view of a user interface 310 of a user application with the model viewer 230 showing a 3D model of the item (e.g., item model 130), according to some example embodiments. The user application may form all or part of user software (e.g., a computer program, a mobile application, an applet, or an app) operable by a user of a model generation machine, a user of a network-based system, or a user of both. The user interface 310 includes a "contact seller" button 312 and a "more info" button 314, in addition to the model viewer 230. In addition, the user interface 310 may include one or more descriptors of the item (e.g., "2016," "Volkswagen," or "Beetle").

[0027] The "contact seller" button 312 is operable (e.g., by the user) to initiate a communication with a seller of the item (e.g., a seller of the car). For example, the "contact seller" button 312 may launch an email editor, an instant messaging window, a chat client, a text message interface, or any suitable combination thereof. In some example embodiments, operation of the "contact seller" button 312 initiates a communication that is pre-addressed to the seller (e.g., by mail address, email address, username, identifier, or phone number).
[0028] The "more info" button 314 is operable (e.g., by the user) to initiate presentation of further information regarding the item shown in the model viewer 230 (e.g., information that references the item). For example, the "more info" button 314 may be a hyperlink that is operable to present a product information document that provides detailed specifications for the item. As another example, operation of the "more info" button 3 4 may present a view item page maintained by the seller of the item and providing merchandising information about the item.
[0029] As noted above, the model viewer 230 may be configured to present the 3D model of the item in any number of views. As such, the model viewer 230 may be configured to respond to one or more cursor inputs (e.g., touchscreen inputs) by manipulating the 3D image of the item (e.g., item model 130) within the model viewer 230.
[0030] FIG. 4 is a face view of a user interface 410 of a seller application configured to facilitate generation of the item model 130 based on a descriptor and the set of images 110, according to some example embodiments. The seller application may form all or part of the seller software (e.g., a computer program, a mobile application, an applet, or an app) operable by a seller of an item using a seller device (e.g., a camera-enabled mobile phone) to communicate with a model generation machine, with a network-based system, or with both. The user interface 410 includes an image viewer 420, a "take photo" button 422, a "save photo to set" button 424, an "upload photo set" button 426, a description entry field 430, and an "upload description" button 432. The seller application may be executable by a seller device that includes a camera, and the seller application may be configured to generate the set of images 110 using the camera of the seller device.
[0031] The image viewer 420 displays an image of the item (e.g., image 111) as captured by the seller device (e.g., by a camera within or connected to the seller device). The image of the item may be stored temporarily or indefinitely on the seller device (e.g., in a memory card, a cache, or a flash drive). Accordingly, the image viewer 420 may display a saved image or an unsaved image. As shown, the image viewer 420 displays a live image from a camera of the seller device.

[0032] The "take photo" button 422 is operable (e.g., by the seller) to save the image shown in the image viewer 420 on the seller device. This may have the effect of mimicking the operation of a camera shutter in taking a photograph. Consequently, one or more activations of the "take photo" button 422 may generate one or more images included in the set of images 110 of the item.
[0033] The "save photo to set" button 424 is operable (e.g., by the seller) to save the image displayed in the image viewer 420 to a set of images (e.g., save the image 11 1 to the set of images 110), In some example embodiments, the set of images is stored by the seller device (e.g., a persistent storage location), and operation of the "save photo to set" button 424 initiates storage of the displayed image (e.g., image 1 11) to be stored among the set of images (e.g., set of images 1 10).
[0034] The "upload photo set" button 426 is operable (e.g., by the seller) to enable access to the set of images (e.g., set of images 110) by a model generation machine, by a network-based system, or by any suitable combination thereof. Enabling access to the set of images may- include transmitting the set of images (e.g., to the model generation machine) or transmitting an authorization to access the set of images. For example, the model generation machine may access (e.g., read) the set of images 110 in response to reception of an authorization to access the set of images 1 10, where the authorization was initiated by activation of the "upload photo set" button 426, As another example, the model generation machine may access (e.g., receive) the set of images 1 10 in response to a transmission of the set of images 110, where the transmission was initiated by activation of the "upload photo set" button 426.
[0035] The description entry field 430 is operable (e.g., by the seller) to enter one or more descriptors pertinent to the item depicted in the set of images (e.g., set of images 110), including the image displayed in the image viewer 420 (e.g., image 111). The description entry field 430 may accept text in the form of alphanumeric characters, including numbers, letters, words, phrases, codes, or any suitable combination thereof. As shown, the description entry field 430 includes multiple descriptors (e.g., "2016," "Volkswagen," "Beetle," "red exterior," and "leopard").
[0036] The "upload description" button 432 is operable (e.g., by the seller) to enable access to the one or more descriptors by a model generation machine, by a network-based system, or by any suitable combination thereof. Enabling access to the one or more descriptors may include transmitting the one or more descriptors (e.g., to the model generation machine) or transmitting an authorization to access the one or more descriptors. As an example, the model generation machine may access (e.g., read) the one or more descriptors in response to reception of an authorization to access the one or more descriptors, where the authorization was initiated by activation of the "upload descriptor" button 432. As another example, the model generation machine may access (e.g., receive) the one or more descriptors in response to a transmission of the one or more descriptors, where the transmission was initiated by activation of the "upload description" button 432,
[0037] FIG. 5 is a network diagram illustrating a network environment 500 suitable for generating the item model 130 based on a descriptor (e.g., "2016 Volkswagen Beetle") and the set of images 110, according to some example embodiments. The network environment 500 includes a model generation machine 510, a product database 512, an item database 514, a user device 530, and the seller device 550, all communicatively coupled to each other via a network 590. As shown, the model generation machine 510, the product database 512, and the item database 514 may form all or part of a network-based commerce system 505. The model generation machine 510 may be implemented in a computer system, as described below with respect to FIG. 11.
[0038] Also shown in FIG. 5 are a user 532 and a seller 552. One or both of the user 532 and the seller 552 may be a human user (e.g., a human being), a machine user (e.g., a software program configured to interact with the user device 530), or any suitable combination thereof (e.g., a human assisted by a machine). The user 532 is not part of the network environment 500, but is associated with the user device 530 and may be a user of the user device 530. For example, the user device 530 may be a desktop computer, a tablet computer, or a smart phone belonging to the user 532. Similarly, the seller 552 is not part of the network environment 500, but is associated with the seller device 550. As an example, the seller device 550 may be a tablet computer belonging to the seller 552. According to various example embodiments, the seller device 550 includes a camera or is otherwise capable of generating one or more images (e.g., image 111) of the item.
[0039] Any of the machines, databases, or devices shown in FIG. 5 may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine. For example, a computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 11. As used herein, a "database" is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database, a triple store, or any suitable combination thereof. Moreover, any two or more of the machines illustrated in FIG. 5 may be combined into a single machine, and the functions described herein for any single machine may be subdivided among multiple machines.

[0040] The network 590 may be any network that enables communication between machines (e.g., model generation machine 510). Accordingly, the network 590 may be a wired network, a wireless network, or any suitable combination thereof. The network 590 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
[0041] FIG. 6 is a block diagram illustrating components of a model generation machine 510, according to some example embodiments. The model generation machine 510 includes an access module 610, an identification module 620, a generation module 630, a communication module 640, and a storage module 650, all configured to communicate with each other (e.g., via a bus, a shared memory, or a switch). Any one or more of these modules may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules.
[0042] The access module 610 is configured to access the set of images 110 and a descriptor (e.g., "Volkswagen Beetle") of an item. The set of images 110 and the descriptor may be provided from the seller device 550. The access module 610 may access the set of images 110, the descriptor, or both, by accessing the item database 514, the seller device 550, or any suitable combination thereof.
[0043] The identification module 620 is configured to identify the product model 120 based on the descriptor of the item. As noted above, the product model 120 is an example of a 3D model of the product of which the item is a specimen. The identification module 620, in identifying the product model 120, may access the product database 512 to access the product model 120, a descriptor (e.g., "Beetle") of the product, or any suitable combination thereof.
[0044] The generation module 630 is configured to generate the item model 130 based on the product model 120 and based on the set of images 110. As noted above, the item model 130 is an example of a 3D model of the item, which may be available for purchase from the seller 552.
[0045] The generation module 630 may be configured to perform edge detection, image segmentation, background removal, or any suitable combination thereof, upon one or more images (e.g., image 111) from the set of images 110. For example, the generation module 630 may detect an edge of the item depicted in the image 111. As another example, the generation module 630 may segment the image 111 into a foreground portion and a background portion, where the foreground portion depicts the item (e.g., the car) and the item is absent from the background portion. As a further example, the generation module 630 may remove the background portion of the image from the image (e.g., after segmentation of the image). In one or more of these examples, the generation module 630 may utilize known techniques for segmentation of images.
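A deliberately simplified sketch of image segmentation and background removal follows, assuming a grayscale image with a known, roughly uniform background intensity; real systems would use far more robust techniques (e.g., edge-based or graph-based segmentation), and all names and thresholds here are illustrative:

```python
def segment_foreground(image, background, tol=10):
    """Split a grayscale image (2D list of 0-255 values) into a foreground
    mask by thresholding against a known background intensity.

    Returns a same-shaped mask: True where the pixel differs from the
    background by more than `tol` (assumed to depict the item).
    """
    return [[abs(px - background) > tol for px in row] for row in image]

def remove_background(image, mask, fill=0):
    """Replace background pixels (mask == False) with a fill value."""
    return [[px if keep else fill for px, keep in zip(row, mrow)]
            for row, mrow in zip(image, mask)]

img = [[200, 200, 200],
       [200,  40, 200],
       [200,  45, 200]]  # a dark "item" on a light background
mask = segment_foreground(img, background=200)
print(remove_background(img, mask))
```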
[0046] In some example embodiments, the generation module 630, in generating the item model 130, identifies an unusable image within the set of images 110 and removes the unusable image from the set of images 110. For example, after one or more of edge detection, image segmentation, or background removal, the generation module 630 may determine that an image depicts an incorrect item (e.g., different from the item depicted in the remainder of the set of images 110), a prohibited item (e.g., an item unsupported by the network-based commerce system 505), or no item at all. As another example, the generation module 630 may determine that an image is unsuitable for use in generating the item model 130 due to, for instance, insufficient resolution, low brightness, poor contrast, lack of clarity (e.g., blur), or any suitable combination thereof. As a further example, the generation module 630 may determine that an image includes prohibited content (e.g., vulgar or obscene text or graphics). Accordingly, the generation module 630 may identify such an image as an unusable image.
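The unusable-image checks described above can be sketched as threshold tests on basic image statistics; the thresholds below are illustrative placeholders, not values taken from the description, and a real system would add blur and prohibited-content checks:

```python
def is_unusable(width, height, mean_brightness, contrast,
                min_pixels=640 * 480, min_brightness=30, min_contrast=15):
    """Flag an image as unusable for item-model generation.

    All thresholds are hypothetical defaults for illustration; a
    production system would tune them empirically.
    """
    if width * height < min_pixels:
        return True   # insufficient resolution
    if mean_brightness < min_brightness:
        return True   # too dark to recover detail
    if contrast < min_contrast:
        return True   # too flat to recover detail
    return False

print(is_unusable(320, 240, 120, 40))    # low resolution -> unusable
print(is_unusable(1920, 1080, 120, 40))  # passes all checks
```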
[0047] In certain example embodiments, the generation module 630, in generating the item model 130, identifies a portion of the product model 120 that is to be texture mapped with an image (e.g., image 111) from the set of images 110. Similarly, in generating the item model 130, the generation module 630 may identify multiple images (e.g., two or more images) from the set of images 110 that intersect in an overlapping region of the product model 120 when the multiple images are texture mapped onto the product model 120. The identification of the portion, the multiple images, or any combination thereof, may be based on an analysis (e.g., comparison) of the foreground of the image with the product model 120.
[0048] In various example embodiments, the generation module 630 texture maps at least some of the set of images 110 onto the product model 120 in generating the item model 130. Accordingly, the generation module 630 may include a texture mapping engine. In alternative example embodiments, the texture mapping is performed by a separate texture mapping engine (e.g., within a graphics processor) within the model generation machine or within the user device 530.
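One illustrative way to decide which image is texture mapped onto which portion of the product model, and to resolve overlapping coverage, is to pick the most head-on view for each face. This is a sketch under the assumption that unit-length face normals and per-image viewing directions are available; nothing in the description prescribes this particular rule:

```python
def assign_images_to_faces(face_normals, image_views):
    """For each model face, pick the image whose viewing direction is most
    head-on to that face (largest dot product of normal and view vector).

    `face_normals` maps face ids to unit (x, y, z) normals; `image_views`
    maps image names to unit view vectors. Overlapping coverage is resolved
    by keeping only the best-aligned image per face.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    return {
        face_id: max(image_views, key=lambda name: dot(normal, image_views[name]))
        for face_id, normal in face_normals.items()
    }

faces = {"left_door": (-1, 0, 0), "roof": (0, 0, 1)}
views = {"left_side.jpg": (-1, 0, 0), "top.jpg": (0, 0, 1)}
print(assign_images_to_faces(faces, views))
```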
[0049] Furthermore, according to some example embodiments, the generation module 630 is configured to generate the model viewer 230 (e.g., for inclusion in the document 220 or in the user interface 310). As noted above, the model viewer 230 may be configured to perform a rotation, a zoom, a pan, or any suitable combination thereof, of the item model 130.

[0050] The communication module 640 is configured to receive the product model 120 (e.g., from a manufacturer of the product of which the item is a specimen), receive a descriptor (e.g., "Beetle") of the product (e.g., from the manufacturer of the product), or any suitable combination thereof. According to certain example embodiments, the communication module 640 provides a user application to the user device 530. The user application may include the user interface 310, which includes the model viewer 230 and is configured to present the model viewer 230 on the user device 530 (e.g., to the user 532).
[0051] In some example embodiments, the communication module 640 provides the document 210 to the user device 530. As noted above, the model viewer 230 is absent from the document 210, though the document 210 includes a descriptor (e.g., "Volkswagen Beetle") of the item, as well as the control interface 216 (e.g., a submission control). Operation of the control interface 216 may cause the communication module 640 to receive a request for the document 220 from the user device 530. In response to the receiving of this request, the communication module 640 may provide the document 220 to the user device 530. As noted above, the document 220 includes the model viewer 230.
[0052] In certain example embodiments, the communication module 640 receives the set of images 110 and the descriptor (e.g., "Volkswagen Beetle") of the item (e.g., from the seller device 550). For example, the communication module 640 may receive the set of images 110 as a result of operation of the "upload photo set" button 426, and the communication module 640 may receive the descriptor of the item as a result of operation of the "upload description" button 432. In other words, the description of the item and the set of images 110 may be received by the communication module 640 as a submission by the seller 552 of the item.
[0053] In various example embodiments, the communication module 640 provides a seller application to the seller device 550. The seller application may include the user interface 410, which may be configured to communicate the set of images 110, the descriptor (e.g., "Volkswagen Beetle") of the item, or both, to the model generation machine 510, the network-based commerce system 505, or any suitable combination thereof.
[0054] The storage module 650 is configured to store the product model 120, the descriptor (e.g., "Beetle") of the product, or both, in the product database 512 (e.g., for access by the identification module 620). In some example embodiments, the storage module 650 stores the item model 130 in the item database 514 (e.g., for access by the model generation machine 510, the network-based commerce system 505, the user device 530, the seller device 550, or any suitable combination thereof). The storage module 650 may also store one or more images (e.g., image 111) in the item database 514, as corresponding to the item, for access by the access module 610.
Similarly, the storage module 650 may store a descriptor (e.g., one or more descriptors uploaded by the seller 552 using the description entry field 430 of the user interface 410) in the item database 514, as corresponding to the item, for access by the access module 610.
[0055] FIG. 7 is a block diagram illustrating modules 710-790 within the generation module 630 of the model generation machine 510, according to some example embodiments. As shown, the generation module 630 includes a usability module 710, an edge detection module 720, an image segmentation module 730, a background removal module 740, an overlap identification module 750, a texture mapping module 760, a model viewer module 770, an application module 780, and a web page module 790, all configured to communicate with each other within the generation module 630. The modules 710-790 may each implement one or more of the functions described above with respect to the generation module 630.
[0056] For example, the usability module 710 may be configured to identify an unusable image within the set of images 110, remove the unusable image from the set of images 110, or both. As noted above, identification of the unusable image may include determining that an image (e.g., image 111) depicts an incorrect item, a prohibited item, or no item at all. This identification may include determining that the image is of poor quality (e.g., has insufficient resolution, low brightness, poor contrast, or blur) or that the image includes prohibited content.
[0057] The edge detection module 720 may detect an edge of the item (e.g., the car) depicted in one or more images (e.g., image 111) within the set of images 110. The image segmentation module 730 may be configured to segment an image into a foreground portion and a background portion, and the background removal module 740 may be configured to remove the background portion of the image.
[0058] The overlap identification module 750 identifies two or more images (e.g., image 111) that overlap each other when texture mapped onto the product model 120, thus intersecting in an overlapping region of the product model 120. The texture mapping module 760 is configured to perform texture mapping of some or all of the set of images 110 onto the product model 120.
[0059] The model viewer module 770 is configured to generate the model viewer 230. In some example embodiments, generation of the model viewer 230 includes generating a widget or pop-up window configured to present (e.g., display, manipulate, or both) the item model 130.

[0060] The application module 780 is configured to generate a user application (e.g., for provision by the communication module 640 to the user device 530). Accordingly, the application module 780 may generate the user interface 310.
[0061] The web page module 790 is configured to generate the document 210, the document 220, or both (e.g., for provision by the communication module 640 to the user device 530). As noted above, one or both of the documents 210 and 220 may be generated as web pages.
[0062] FIGS. 8-10 are flowcharts illustrating operations in a method 800 of generating the item model 130 based on a descriptor (e.g., "Volkswagen Beetle") and the set of images 110, according to some example embodiments. Operations of the method 800 may be performed by the model generation machine 510, using modules described above with respect to FIGS. 6-7.
[0063] As shown in FIG. 8, some example embodiments of the method 800 include operations 810, 820, and 830. In operation 810, the access module 610 of the model generation machine 510 accesses the set of images 110 and a descriptor (e.g., "Volkswagen Beetle") of the item depicted in the set of images 110. For example, the access module 610 may access the set of images 110, the descriptor, or both, by accessing the item database 514, the seller device 550, or any suitable combination thereof.
[0064] In operation 820, the identification module 620 of the model generation machine 510 identifies the product model 120 based on the descriptor of the item. For example, the identification module 620 may access the descriptor of the item (e.g., stored in the item database 514), access a descriptor of the product (e.g., stored in the product database 512), and perform a comparison of the two descriptors. Based on the comparison, the identification module 620 may determine that the item is a specimen of the product and identify the product model 120 as corresponding to the item.
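The descriptor comparison in operation 820 can be sketched as token-overlap scoring against a catalog of product descriptors; the scoring rule (Jaccard similarity) and the threshold below are illustrative assumptions, not part of the description:

```python
def match_score(item_descriptor, product_descriptor):
    """Jaccard overlap between the token sets of two descriptors."""
    a = set(item_descriptor.lower().split())
    b = set(product_descriptor.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def identify_product(item_descriptor, catalog, threshold=0.3):
    """Return the catalog product whose descriptor best matches the item
    descriptor, or None if no product scores above the threshold."""
    best = max(catalog, key=lambda p: match_score(item_descriptor, p))
    return best if match_score(item_descriptor, best) >= threshold else None

catalog = ["Volkswagen Beetle", "Volkswagen Golf", "Ford Focus"]
print(identify_product("2016 Volkswagen Beetle red", catalog))
```

Once a product descriptor is matched, its stored reference to the product model (per paragraph [0018]) identifies the model to use for generation.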
[0065] In operation 830, the generation module 630 of the model generation machine 510 generates the item model 130 based on the product model 120 (e.g., as identified in operation 820) and based on the set of images 110 (e.g., as accessed in operation 810). Further details of operation 830, according to some example embodiments, are discussed below with respect to FIG. 10.

[0066] As shown in FIG. 9, some example embodiments of the method 800 include one or more of operations 910-984. In operation 910, the communication module 640 of the model generation machine 510 receives the product model 120 from a manufacturer of the product (e.g., from a server machine maintained by the manufacturer). In operation 912, the storage module 650 of the model generation machine 510 stores the product model 120 in the product database 512 (e.g., for access in operation 810).
[0067] Similarly, in operation 920, the communication module 640 of the model generation machine 510 receives a descriptor of the product from the manufacturer of the product. Likewise, in operation 922, the storage module 650 of the model generation machine 510 stores the descriptor of the product in the product database 512 (e.g., for access in operation 820).
[0068] Operation 930 may be executed at any point prior to performance of operation 810. In operation 930, the communication module 640 of the model generation machine 510 provides a seller application to the seller device 550. The seller application may be generated by the application module 780 of the model generation machine 510 prior to performance of operation 930. As noted above, the seller application may be configured to communicate the set of images 110, the descriptor of the item, or both, from the seller device 550 to the network-based commerce system 505 (e.g., to the model generation machine 510).
[0069] In a similar fashion, operation 940 may be executed at any point prior to performance of operation 810. In operation 940, the communication module 640 provides a user application to the user device 530. The user application may be generated by the application module 780 prior to performance of operation 940. As noted above, the user application may be configured to present the model viewer 230 on the user device 530.
[0070] Operation 950 may be performed as part of operation 820, performed in parallel (e.g., contemporaneously) with operation 820, performed in response to operation 820, or any suitable combination thereof. In operation 950, the access module 610 of the model generation machine 510 accesses the product model 120 (e.g., by accessing the product database 512). Accordingly, the access module 610 may provide the product model 120 to the generation module 630 (e.g., for use in operation 830).
[0071] In operation 960, the storage module 650 of the model generation machine 510 stores the item model 130 in the item database 514. This may have the effect of preserving the item model 130 for use in generating the model viewer 230, as described immediately below with respect to operation 970.
[0072] In operation 970, the generation module 630 of the model generation machine 510 generates the model viewer 230. The model viewer 230 may be generated as a generic model viewer without the item model 130 or generated as a specific model viewer based on (e.g., including) the item model 130. Accordingly, generation of the model viewer 230 may include accessing the item model 130 (e.g., by accessing the item database 514).
[0073] In operation 972, the communication module 640 of the model generation machine 510 provides the model viewer 230 to the user device 530 (e.g., to a user application executing on the user device 530). For example, the user application may display the user interface 310 on the user device 530, and the communication module 640 may provide the model viewer 230 for inclusion in the user interface 310. In operation 974, the communication module 640 provides the item model 130 to the user device 530 (e.g., to the user application executing on the user device 530). In some example embodiments, the model viewer 230 includes the item model 130, and these operations 972 and 974 may be performed as a single operation.
[0074] In operation 980, the communication module 640 of the model generation machine 510 provides the document 210 (e.g., a web page without the model viewer 230) to the user device 530 (e.g., to a browser executing on the user device 530). As noted above, the document 210 may include a control interface 216 that is operable to submit a request for information regarding the item. Supposing that the control interface 216 is operated, in operation 982, the communication module 640 receives the request for information regarding the item (e.g., as communicated from the user device 530). In operation 984, the communication module 640 provides the document 220 (e.g., a web page with the model viewer 230) to the user device 530. In example embodiments where the item model 130 is included in the model viewer 230, the item model 130 is accordingly provided along with the model viewer 230. In alternative example embodiments where the item model 130 is not included in the model viewer 230, a further operation may be performed by the communication module 640 to provide the item model 130 to the user device 530 for inclusion in the model viewer 230.
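The two-document flow of operations 980-984 can be sketched as a simple dispatch. The dictionary-based pages and the request labels are hypothetical stand-ins for the web pages (documents 210 and 220) and the control interface 216.

```python
# Sketch of the request flow in operations 980-984: a first page without the
# model viewer, a control that submits a request, and a second page carrying
# the viewer. The page dictionaries and request labels are hypothetical.

def serve(request, item):
    """Return document 210 for the initial request, or document 220 (with
    the model viewer and item model) once the control has been operated."""
    if request == "initial":
        # Document 210: descriptor plus a control operable to request 3D info.
        return {"page": "document_210",
                "descriptor": item["descriptor"],
                "control": "view_in_3d"}
    if request == "view_in_3d":
        # Document 220: includes the model viewer with the item model.
        return {"page": "document_220",
                "viewer": {"item_model": item["model"]}}
    raise ValueError("unknown request")
```

Operating the control in the first page produces the request whose response carries the viewer, so the item model never loads until a user asks for it.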
[0075] As shown in FIG. 10, some example embodiments of the method 800 include one or more of operations 1010-1070. In operation 1010, the communication module 640 of the model generation machine 510 receives one or more images (e.g., image 111) from the seller device 550. The one or more images may constitute all or part of the set of images 110. For example, operation 1010 may be performed in response to operation of the "upload photo set" button 426 in the user interface 410 of a seller application executing on the seller device 550. In a further operation, the storage module 650 of the model generation machine 510 may store the one or more images in the item database 514 (e.g., for access in operation 810).
[0076] Similarly, in operation 1020, the communication module 640 receives one or more descriptors of the item from the seller device 550. The one or more descriptors may constitute all or part of a description of the item. For example, operation 1020 may be performed in response to operation of the "upload description" button 432 in the user interface 410 of the seller application executing on the seller device 550. In a further operation, the storage module 650 of the model generation machine 510 may store the one or more descriptors in the item database 514 (e.g., for access in operation 810).
[0077] One or more of operations 1030-1070 may be included in operation 830, which may be performed by the generation module 630 of the model generation machine 510, as noted above. According to various example embodiments, one or more of the modules described above with respect to FIG. 7 are used to perform one or more of operations 1030-1070.
[0078] In operation 1030, the usability module 710 identifies an unusable image (e.g., image 111) among the set of images 110. In response to identification of the unusable image, in operation 1032, the usability module 710 may remove the unusable image from the set of images 110.
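One plausible usability test for operation 1030 is a sharpness check: blurry or featureless photographs produce a weak high-frequency response. The variance-of-Laplacian heuristic and its threshold below are assumptions for illustration, not details from the specification.

```python
import numpy as np

def laplacian_variance(image):
    """Variance of a 4-neighbour discrete Laplacian over the interior pixels;
    low values indicate little high-frequency detail (blur or no content)."""
    img = np.asarray(image, dtype=float)
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2] + img[1:-1, 2:]
           - 4.0 * img[1:-1, 1:-1])
    return lap.var()

def remove_unusable(images, min_sharpness=10.0):
    """Keep only images whose sharpness clears the floor (operation 1032).
    The threshold is illustrative and would be tuned in practice."""
    return [im for im in images if laplacian_variance(im) >= min_sharpness]
```

An image of alternating bright and dark rows scores far above the floor, while a uniform grey frame scores zero and is dropped.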
[0079] In operation 1040, the edge detection module 720 detects at least one edge within an image (e.g., image 111) among the set of images 110. For example, the edge detection module 720 may detect an edge of the item as depicted in the image. In operation 1042, the image segmentation module 730 segments the image (e.g., image 111) into a foreground portion and a background portion. As noted above, the foreground portion may depict the item, and the item may be absent from the background portion. In operation 1044, the background removal module 740 removes the background portion of the image (e.g., image 111) from the image (e.g., leaving only the foreground portion within the image).
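Operations 1040-1044 can be sketched with elementary image operations. The finite-difference edge detector and the intensity-threshold segmentation are illustrative stand-ins; a production system would more likely use a seeded segmentation such as GrabCut.

```python
import numpy as np

def gradient_edges(image):
    """Edge strength as the magnitude of finite-difference gradients
    (a crude stand-in for operation 1040's edge detector)."""
    img = np.asarray(image, dtype=float)
    gy = np.abs(np.diff(img, axis=0, append=img[-1:, :]))
    gx = np.abs(np.diff(img, axis=1, append=img[:, -1:]))
    return gx + gy

def segment(image):
    """Split an image into foreground and background masks with a mid-range
    intensity threshold, assuming the item is brighter than its backdrop."""
    img = np.asarray(image, dtype=float)
    cut = (img.min() + img.max()) / 2.0
    foreground = img > cut
    return foreground, ~foreground

def remove_background(image, foreground_mask):
    """Zero out background pixels, leaving only the item (operation 1044)."""
    out = np.asarray(image, dtype=float).copy()
    out[~foreground_mask] = 0.0
    return out
```

On a dark frame containing a bright square, the mask isolates the square and background removal leaves only its pixels.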
[0080] In operation 1050, the texture mapping module 760 identifies a portion of the product model 120 to be texture mapped with an image (e.g., image 111) from the set of images 110. In operation 1060, the overlap identification module 750 identifies two or more images (e.g., image 111) from the set of images 110 that intersect in an overlapping region of the product model 120 when texture mapped onto the product model 120. In operation 1070, the texture mapping module 760 texture maps at least some of the set of images 110 onto the product model 120. In some example embodiments, the texture mapping module 760 performs the texture mapping based on (e.g., taking into account) the overlapping region identified in operation 1060.
[0081] According to various example embodiments, one or more of the methodologies described herein may facilitate communication of information about an item available for purchase from a seller. In particular, one or more of the methodologies described herein may constitute all or part of a business method (e.g., a business method implemented using a machine) that provides a seller with an efficient and convenient way to create a 3D model of the item, that provides a user with an efficient and convenient way to receive 3D information about the item, or any suitable combination thereof. Accordingly, one or more of the methodologies described herein may have the effect of facilitating a purchase of the item, increasing sales of the product of which the item is a specimen, increasing user attention (e.g., as measured in page views or click-throughs) on the product, or any suitable combination thereof.
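The overlap identification of operation 1060 and the blending of operation 1070 can be sketched by treating the product model as a set of face identifiers, with each image registered to the faces it covers. Both this face-set representation and the averaging blend are assumptions invented for the sketch.

```python
# The product model is reduced here to face identifiers, and each seller
# image is assumed to already be registered to the faces it covers.

def find_overlaps(coverage):
    """Return the faces covered by two or more images, i.e. the overlapping
    region identified in operation 1060."""
    seen, overlap = set(), set()
    for faces in coverage.values():
        overlap |= seen & faces
        seen |= faces
    return overlap

def texture_map(coverage, textures):
    """Assign each face a texture value; faces inside an overlapping region
    are blended by averaging across the images that cover them."""
    per_face = {}
    for name, faces in coverage.items():
        for face in faces:
            per_face.setdefault(face, []).append(textures[name])
    return {face: sum(vals) / len(vals) for face, vals in per_face.items()}
```

Two photographs that share a pair of faces thus blend in that region while each texturing its exclusive faces alone.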
[0082] When these effects are considered in aggregate, one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in matching users (e.g., as potential purchasers) with products or specimens thereof that are likely to be of interest. Efforts expended by a user in identifying a product for purchase may be reduced by one or more of the methodologies described herein. Computing resources used by one or more machines, databases, or devices (e.g., within the network environment 500) may similarly be reduced. Examples of such computing resources include processor cycles, network traffic, memory usage, data storage capacity, power consumption, and cooling capacity.
[0083] FIG. 11 illustrates components of a machine 1100, according to some example embodiments, that is able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 11 shows a diagrammatic representation of the machine 1100 in the example form of a computer system and within which instructions 1124 (e.g., software) for causing the machine 1100 to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine 1100 operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1100 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1100 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1124 (sequentially or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include a collection of machines that individually or jointly execute the instructions 1124 to perform any one or more of the methodologies discussed herein.
[0084] The machine 1100 includes a processor 1102 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1104, and a static memory 1106, which are configured to communicate with each other via a bus 1108. The machine 1100 may further include a graphics display 1110 (e.g., a plasma display panel (PDP), a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)). The machine 1100 may also include an alphanumeric input device 1112 (e.g., a keyboard), a cursor control device 1114 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 1116, a signal generation device 1118 (e.g., a speaker), and a network interface device 1120.
[0085] The storage unit 1116 includes a machine-readable medium 1122 on which is stored the instructions 1124 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 1124 may also reside, completely or at least partially, within the main memory 1104, within the processor 1102 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 1100. Accordingly, the main memory 1104 and the processor 1102 may be considered as machine-readable media. The instructions 1124 may be transmitted or received over a network 1126 (e.g., network 590) via the network interface device 1120.
[0086] As used herein, the term "memory" refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1122 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions (e.g., instructions 1124). The term "machine-readable medium" shall also be taken to include any medium that is capable of storing instructions (e.g., software) for execution by the machine, such that the instructions, when executed by one or more processors of the machine (e.g., processor 1102), cause the machine to perform any one or more of the methodologies described herein. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, a data repository in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
[0087] Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[0088] Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A "hardware module" is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[0089] In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0090] Accordingly, the term "hardware module" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, "hardware-implemented module" refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
[0091] Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output.
Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
[0092] The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or
permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, "processor-implemented module" refers to a hardware module implemented using one or more processors.
[0093] Similarly, the methods described herein may be at least partially processor-implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a "cloud computing" environment or as a "software as a service" (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
[0094] The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0095] Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an "algorithm" is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities.
Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as "data," "content," "bits," "values," "elements," "symbols," "characters," "terms," "numbers," "numerals," or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
[0096] Unless specifically stated otherwise, discussions herein using words such as "processing," "computing," "calculating," "determining," "presenting," "displaying," or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms "a" or "an" are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction "or" refers to a non-exclusive "or," unless specifically stated otherwise.
[0097] The following enumerated descriptions define various example embodiments of methods and systems (e.g., apparatus) discussed herein:
[0098] 1. A method comprising:
accessing a set of images of an item and a descriptor of the item, the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item, the item being a specimen of a product having a three-dimensional (3D) shape;
identifying a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape; and
generating a 3D model of the item based on the identified 3D model of the product and based on the set of images, the generating of the 3D model of the item being performed using a processor of a machine.
[0099] 2. The method of description 1 further comprising:
receiving the 3D model of the product; and
storing the 3D model of the product in a product database; wherein
the identifying of the 3D model includes accessing the 3D model of the product within the product database.
[00100] 3. The method of description 2, wherein:
the receiving of the 3D model of the product is from a manufacturer of the product.
[0100] 4. The method of any of descriptions 1-3 further comprising:
receiving a descriptor of the product that corresponds to the descriptor of the item; and storing the descriptor of the product in a product database; wherein
the identifying of the 3D model of the product includes accessing the descriptor of the product within the product database.
[0101] 5. The method of description 4, wherein:
the receiving of the descriptor of the product is from a manufacturer of the product.
[0102] 6. The method of description 4 or description 5, wherein:
the descriptor of the item includes at least one of a manufacturer name of the product, a model name of the product, a model year of the product, the descriptor of the product, an abbreviation of the descriptor of the product, a variation of the descriptor of the product, a nickname of the descriptor of the product, a misspelling of the descriptor of the product, or a code specifying the descriptor of the product.
[0103] 7. The method of any of descriptions 1-6, wherein:
the generating of the 3D model of the item includes identifying an unusable image within a set of images; and removing the unusable image from the set of images.
[0104] 8. The method of any of descriptions 1-7, wherein:
the generating of the 3D model of the item includes at least one of detecting an edge of the item depicted in an image from the set of images, segmenting the image into a foreground portion that depicts the item and a background portion from which the item is absent, or removing the background portion from the image.
[0105] 9. The method of any of descriptions 1-8, wherein:
the generating of the 3D model of the item includes identifying a portion of the 3D model of the product to be texture mapped with an image from the set of images.
[0106] 10. The method of any of descriptions 1-9, wherein:
the generating of the 3D model of the item includes identifying two or more images from the set of images that intersect in an overlapping region when texture mapped onto the 3D model of the product.
[0107] 11. The method of any of descriptions 1-10, wherein:
the generating of the 3D model of the item includes texture mapping at least some of the set of images onto the 3D model of the product.
[0108] 12. The method of any of descriptions 1-11 further comprising:
storing the 3D model of the item in an item database; and
generating a model viewer that includes the 3D model of the item, the model viewer being configured to perform at least one of a rotation of the 3D model of the item, a zoom of the 3D model of the item, or a pan of the 3D model of the item.
[0109] 13. The method of description 12 further comprising:
providing a user application to a user device corresponding to a user of a network-based commerce system, the user application being configured to present the model viewer on the user device.
[0110] 14. The method of description 12 or description 13 further comprising:
providing the model viewer within a web page to a user device corresponding to a user of a network-based commerce system.
[0111] 15. The method of description 14 further comprising:
receiving a request from the user device, the request being for information regarding the item; and
providing the web page to the user device in response to the receiving of the request.
[0112] 16. The method of description 15 further comprising:
providing to the user device a further web page from which the 3D model of the item is absent, the further web page including the descriptor of the item and a submission control operable to submit the request; and wherein
the receiving of the request is resultant from operation of the submission control.
[0113] 17. The method of any of descriptions 1-16 further comprising:
receiving the set of images and the descriptor of the item from the seller device.
[0114] 18. The method of description 17 further comprising:
providing a seller application to the seller device, the seller application being configured to communicate the set of images and the descriptor of the item from the seller device to a network-based commerce system.
[0115] 19. The method of description 17 or description 18, wherein:
the seller device includes a camera; and
the seller application is configured to generate the set of images using the camera.
[0116] 20. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a set of images of an item and a descriptor of the item, the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item, the item being a specimen of a product having a three-dimensional (3D) shape;
identifying a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape; and
generating a 3D model of the item based on the identified 3D model of the product and based on the set of images.
[0117] 21. A system comprising:
an access module configured to access a set of images of an item and a descriptor of the item, the set of images and the descriptor of the item being provided from a seller device
corresponding to a seller of the item, the item being a specimen of a product having a three-dimensional (3D) shape;
an identification module configured to identify a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape; and
a generation module configured to generate a 3D model of the item based on the identified 3D model of the product and based on the set of images, the generation module being implemented using a processor of a machine.
[0118] 22. A system comprising:
means for accessing a set of images of an item and a descriptor of the item, the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item, the item being a specimen of a product having a three-dimensional (3D) shape;
means for identifying a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape; and
means for generating a 3D model of the item based on the identified 3D model of the product and based on the set of images.

Claims

CLAIMS
What is claimed is:
1. A method comprising:
accessing a set of images of an item and a descriptor of the item,
the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item,
the item being a specimen of a product having a three-dimensional (3D) shape;
identifying a 3D model of the product based on the descriptor of the item,
the 3D model of the product including data that is representative of the 3D shape; and
generating a 3D model of the item based on the identified 3D model of the product and based on the set of images,
the generating of the 3D model of the item being performed using a processor of a machine.
2. The method of claim 1 further comprising:
receiving the 3D model of the product; and
storing the 3D model of the product in a product database; wherein
the identifying of the 3D model includes accessing the 3D model of the product within the product database.
3. The method of claim 2, wherein:
the receiving of the 3D model of the product is from a manufacturer of the product.
4. The method of claim 1 further comprising:
receiving a descriptor of the product that corresponds to the descriptor of the item; and storing the descriptor of the product in a product database; wherein
the identifying of the 3D model of the product includes accessing the descriptor of the product within the product database.
5. The method of claim 4, wherein:
the receiving of the descriptor of the product is from a manufacturer of the product.
6. The method of claim 4, wherein:
the descriptor of the item includes at least one of
a manufacturer name of the product,
a model name of the product,
a model year of the product,
the descriptor of the product,
an abbreviation of the descriptor of the product,
a variation of the descriptor of the product,
a nickname of the descriptor of the product,
a misspelling of the descriptor of the product, or
a code specifying the descriptor of the product.
7. The method of claim 1, wherein:
the generating of the 3D model of the item includes
identifying an unusable image within a set of images; and
removing the unusable image from the set of images.
8. The method of claim 1, wherein:
the generating of the 3D model of the item includes at least one of
detecting an edge of the item depicted in an image from the set of images, segmenting the image into a foreground portion that depicts the item and a
background portion from which the item is absent, or removing the background portion from the image.
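Two of claim 8's operations, segmenting an image into foreground and background and removing the background, can be sketched as follows. A production system would use edge detection or learned segmentation; the global intensity threshold here is an illustrative stand-in.

```python
# Minimal sketch: foreground/background segmentation by a global threshold,
# then background removal. The threshold value is an assumption.
import numpy as np

def segment_foreground(image: np.ndarray, threshold: float = 32.0) -> np.ndarray:
    """Boolean mask that is True where the item (bright pixels) is depicted."""
    return image > threshold

def remove_background(image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Zero out background pixels, keeping only the foreground item."""
    return np.where(mask, image, 0)
```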
9. The method of claim 1, wherein:
the generating of the 3D model of the item includes identifying a portion of the 3D
model of the product to be texture mapped with an image from the set of images.
10. The method of claim 1, wherein:
the generating of the 3D model of the item includes identifying two or more images from the set of images that intersect in an overlapping region when texture mapped onto the 3D model of the product.
11. The method of claim 1, wherein:
the generating of the 3D model of the item includes texture mapping at least some of the set of images onto the 3D model of the product.
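The texture mapping of claim 11 can be sketched as a lookup from (u, v) coordinates on the product model into a seller image. The coordinate convention (v measured upward, nearest-neighbour sampling) is an assumption for illustration, not taken from the patent.

```python
# Illustrative sketch of texture sampling: each model vertex carries (u, v)
# in [0, 1]^2 into a seller image; nearest-neighbour lookup assigns pixels.
import numpy as np

def sample_texture(image: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Sample image values at (u, v) coordinates by nearest-neighbour lookup."""
    h, w = image.shape[:2]
    cols = np.clip((uv[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    # v = 1 maps to the top row of the image in this convention.
    rows = np.clip(((1.0 - uv[:, 1]) * (h - 1)).round().astype(int), 0, h - 1)
    return image[rows, cols]
```

Where two images intersect in an overlapping region when mapped onto the model (claim 10), their samples for the shared (u, v) range would need to be blended or one image preferred.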
12. The method of claim 1 further comprising:
storing the 3D model of the item in an item database; and
generating a model viewer that includes the 3D model of the item,
the model viewer being configured to perform at least one of
a rotation of the 3D model of the item,
a zoom of the 3D model of the item, or
a pan of the 3D model of the item.
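The three model-viewer operations of claim 12 correspond to standard geometric transforms on the model's points: rotation, uniform scaling (zoom), and translation (pan). The sketch below applies them on the CPU to a point array; a real viewer would perform the equivalent on the GPU.

```python
# Illustrative viewer transforms: rotate about the vertical (y) axis,
# zoom as uniform scaling, pan as translation of an (N, 3) point array.
import numpy as np

def rotate_y(points: np.ndarray, angle_rad: float) -> np.ndarray:
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    r = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    return points @ r.T

def zoom(points: np.ndarray, factor: float) -> np.ndarray:
    return points * factor

def pan(points: np.ndarray, offset) -> np.ndarray:
    return points + np.asarray(offset)
```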
13. The method of claim 12 further comprising:
providing a user application to a user device corresponding to a user of a network-based commerce system,
the user application being configured to present the model viewer on the user device.
14. The method of claim 12 further comprising:
providing the model viewer within a web page to a user device corresponding to a user of a network-based commerce system.
15. The method of claim 14 further comprising:
receiving a request from the user device,
the request being for information regarding the item; and
providing the web page to the user device in response to the receiving of the request.
16. The method of claim 15 further comprising:
providing to the user device a further web page from which the 3D model of the item is absent,
the further web page including the descriptor of the item and a submission control operable to submit the request; and wherein
the receiving of the request is resultant from operation of the submission control.
17. The method of claim 1 further comprising:
receiving the set of images and the descriptor of the item from the seller device.
18. The method of claim 17 further comprising:
providing a seller application to the seller device,
the seller application being configured to communicate the set of images and the descriptor of the item from the seller device to a network-based commerce system.
19. The method of claim 17, wherein:
the seller device includes a camera; and
the seller application is configured to generate the set of images using the camera.
20. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a set of images of an item and a descriptor of the item,
the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item,
the item being a specimen of a product having a three-dimensional (3D) shape; identifying a 3D model of the product based on the descriptor of the item,
the 3D model of the product including data that is representative of the 3D shape; and
generating a 3D model of the item based on the identified 3D model of the product and based on the set of images.
21. A system comprising:
an access module configured to access a set of images of an item and a descriptor of the item,
the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item,
the item being a specimen of a product having a three-dimensional (3D) shape; an identification module configured to identify a 3D model of the product based on the descriptor of the item,
the 3D model of the product including data that is representative of the 3D shape; and
a generation module configured to generate a 3D model of the item based on the
identified 3D model of the product and based on the set of images, the generation module being implemented using a processor of a machine.
22. A system comprising:
means for accessing a set of images of an item and a descriptor of the item,
the set of images and the descriptor of the item being provided from a seller device corresponding to a seller of the item,
the item being a specimen of a product having a three-dimensional (3D) shape; means for identifying a 3D model of the product based on the descriptor of the item, the 3D model of the product including data that is representative of the 3D shape; and
means for generating a 3D model of the item based on the identified 3D model of the product and based on the set of images.
PCT/US2012/028785 2011-04-07 2012-03-12 Item model based on descriptor and images WO2012138452A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CA2832227A CA2832227C (en) 2011-04-07 2012-03-12 Item model based on descriptor and images
CN201280022041.XA CN103548051B (en) 2011-04-07 2012-03-12 Descriptor and image based item model
KR1020127032750A KR101420041B1 (en) 2011-04-07 2012-03-12 Item model based on descriptor and images
EP12767928.0A EP2695130A4 (en) 2011-04-07 2012-03-12 Item model based on descriptor and images
AU2012240539A AU2012240539B2 (en) 2011-04-07 2012-03-12 Item model based on descriptor and images
CN201911127688.9A CN110942370B (en) 2011-04-07 2012-03-12 Descriptor and image based project model

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/082,110 2011-04-07
US13/082,110 US8473362B2 (en) 2011-04-07 2011-04-07 Item model based on descriptor and images

Publications (1)

Publication Number Publication Date
WO2012138452A1 true WO2012138452A1 (en) 2012-10-11

Family

ID=46966840

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/028785 WO2012138452A1 (en) 2011-04-07 2012-03-12 Item model based on descriptor and images

Country Status (7)

Country Link
US (4) US8473362B2 (en)
EP (1) EP2695130A4 (en)
KR (1) KR101420041B1 (en)
CN (2) CN110942370B (en)
AU (1) AU2012240539B2 (en)
CA (1) CA2832227C (en)
WO (1) WO2012138452A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8473362B2 (en) 2011-04-07 2013-06-25 Ebay Inc. Item model based on descriptor and images

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8314790B1 (en) * 2011-03-29 2012-11-20 Google Inc. Layer opacity adjustment for a three-dimensional object
US9183672B1 (en) * 2011-11-11 2015-11-10 Google Inc. Embeddable three-dimensional (3D) image viewer
US20150206099A1 (en) * 2012-07-11 2015-07-23 Tyco Electronics Raychem Bvba Integrated three dimensional product access and display system
US9552598B2 (en) 2012-10-12 2017-01-24 Ebay Inc. Mobile trigger web workflow
US9374517B2 (en) 2012-10-12 2016-06-21 Ebay Inc. Guided photography and video on a mobile device
US10049429B2 (en) 2013-07-09 2018-08-14 Jung Ha RYU Device and method for designing using symbolized image, and device and method for analyzing design target to which symbolized image is applied
US9229674B2 (en) 2014-01-31 2016-01-05 Ebay Inc. 3D printing: marketplace with federated access to printers
US9827714B1 (en) 2014-05-16 2017-11-28 Google Llc Method and system for 3-D printing of 3-D object models in interactive content items
US9684440B2 (en) * 2014-06-30 2017-06-20 Apple Inc. Progressive rotational view
EP3192237A4 (en) 2014-09-10 2018-07-25 Hasbro, Inc. Toy system with manually operated scanner
US10445798B2 (en) * 2014-09-12 2019-10-15 Onu, Llc Systems and computer-readable medium for configurable online 3D catalog
KR101642107B1 (en) * 2014-10-29 2016-07-22 한국생산기술연구원 Manufacturing support system based on knowledge-intensive digital model
US20160167307A1 (en) * 2014-12-16 2016-06-16 Ebay Inc. Systems and methods for 3d digital printing
US9595037B2 (en) 2014-12-16 2017-03-14 Ebay Inc. Digital rights and integrity management in three-dimensional (3D) printing
US20170193644A1 (en) * 2015-12-30 2017-07-06 Ebay Inc Background removal

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080126021A1 (en) * 2006-11-27 2008-05-29 Ramsay Hoguet Converting web content into texture mapping objects
US20080211809A1 (en) * 2007-02-16 2008-09-04 Samsung Electronics Co., Ltd. Method, medium, and system with 3 dimensional object modeling using multiple view points
KR20080083843A (en) * 2007-03-13 2008-09-19 삼성전자주식회사 Apparatus and method for generating image to generate 3d image
KR20090004326A (en) * 2006-12-04 2009-01-12 한국전자통신연구원 Shopping assistance service method and system by using a 3d rendering technique
KR20090048979A (en) * 2007-11-12 2009-05-15 에스케이 텔레콤주식회사 Method and apparatus for providing advertizement service in communication network
US20100284607A1 (en) * 2007-06-29 2010-11-11 Three Pixels Wide Pty Ltd Method and system for generating a 3d model from images
US8848399B2 (en) 2010-08-18 2014-09-30 Finsix Corporation Very high frequency switching resonant synchronous rectification

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5255352A (en) 1989-08-03 1993-10-19 Computer Design, Inc. Mapping of two-dimensional surface detail on three-dimensional surfaces
WO1995015533A1 (en) * 1993-11-30 1995-06-08 Burke Raymond R Computer system for allowing a consumer to purchase packaged goods at home
US5903270A (en) 1997-04-15 1999-05-11 Modacad, Inc. Method and apparatus for mapping a two-dimensional texture onto a three-dimensional surface
GB9626825D0 (en) 1996-12-24 1997-02-12 Crampton Stephen J Avatar kiosk
WO1998058351A1 (en) 1997-06-17 1998-12-23 British Telecommunications Public Limited Company Generating an image of a three-dimensional object
US6157747A (en) 1997-08-01 2000-12-05 Microsoft Corporation 3-dimensional image rotation method and apparatus for producing image mosaics
US6677944B1 (en) 1998-04-14 2004-01-13 Shima Seiki Manufacturing Limited Three-dimensional image generating apparatus that creates a three-dimensional model from a two-dimensional image by image processing
US6999073B1 (en) * 1998-07-20 2006-02-14 Geometrix, Inc. Method and system for generating fully-textured 3D
JP3954211B2 (en) * 1998-08-20 2007-08-08 富士通株式会社 Method and apparatus for restoring shape and pattern in 3D scene
US6278460B1 (en) 1998-12-15 2001-08-21 Point Cloud, Inc. Creating a three-dimensional model from two-dimensional images
US8266040B2 (en) 2000-01-31 2012-09-11 New York Stock Exchange Llc Virtual trading floor system and method
US7053906B2 (en) 2000-03-08 2006-05-30 Sony Computer Entertainment Inc. Texture mapping method, recording medium, program, and program executing apparatus
US7728848B2 (en) 2000-03-28 2010-06-01 DG FastChannel, Inc. Tools for 3D mesh and texture manipulation
JP4474743B2 (en) 2000-07-03 2010-06-09 ソニー株式会社 3D image generating apparatus, 3D image generating method, and program recording medium
US20020072993A1 (en) * 2000-11-03 2002-06-13 Sandus James A. Method and system of an integrated business topography and virtual 3D network portal
JP2002236941A (en) * 2001-02-09 2002-08-23 Minolta Co Ltd Electronic catalog system, and server, program and recording medium used in electronic catalog system
US6985145B2 (en) 2001-11-09 2006-01-10 Nextengine, Inc. Graphical interface for manipulation of 3D models
KR100477801B1 (en) * 2002-12-26 2005-03-22 한국전자통신연구원 Apparatus and Method of 3-Dimensional Image Data Description and Apparatus and Method of 3-Dimensional Image Data search
US8126907B2 (en) 2004-08-03 2012-02-28 Nextengine, Inc. Commercial shape search engine
EP1877982A1 (en) * 2005-04-25 2008-01-16 Yappa Corporation 3d image generation and display system
US7487116B2 (en) * 2005-12-01 2009-02-03 International Business Machines Corporation Consumer representation rendering with selected merchandise
JP4894369B2 (en) * 2006-06-19 2012-03-14 富士通株式会社 3D model image processing device
US20080043013A1 (en) * 2006-06-19 2008-02-21 Kimberly-Clark Worldwide, Inc System for designing shopping environments
US7656402B2 (en) * 2006-11-15 2010-02-02 Tahg, Llc Method for creating, manufacturing, and distributing three-dimensional models
US8253731B2 (en) * 2006-11-27 2012-08-28 Designin Corporation Systems, methods, and computer program products for home and landscape design
CN100430690C (en) * 2006-12-19 2008-11-05 南京航空航天大学 Method for making three-dimensional measurement of objects utilizing single digital camera to freely shoot
JP4990173B2 (en) * 2008-01-28 2012-08-01 株式会社リコー Image processing apparatus, image processing method, and program
US8125481B2 (en) * 2008-03-21 2012-02-28 Google Inc. Lightweight three-dimensional display
US8243334B2 (en) * 2008-06-06 2012-08-14 Virginia Venture Industries, Llc Methods and apparatuses for printing three dimensional images
CN101686335A (en) * 2008-09-28 2010-03-31 新奥特(北京)视频技术有限公司 Method and device for acquiring three-dimensional image model
US20110218825A1 (en) * 2010-03-03 2011-09-08 International Business Machines Corporation Three-dimensional interactive vehicle damage claim interface
US8570343B2 (en) * 2010-04-20 2013-10-29 Dassault Systemes Automatic generation of 3D models from packaged goods product images
US8463026B2 (en) * 2010-12-22 2013-06-11 Microsoft Corporation Automated identification of image outliers
US8473362B2 (en) 2011-04-07 2013-06-25 Ebay Inc. Item model based on descriptor and images
AU2015249179B2 (en) 2011-04-07 2016-10-27 Ebay Inc. Item model based on descriptor and images


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2695130A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8473362B2 (en) 2011-04-07 2013-06-25 Ebay Inc. Item model based on descriptor and images
US9805503B2 (en) 2011-04-07 2017-10-31 Ebay Inc. Item model based on descriptor and images
US10157496B2 (en) 2011-04-07 2018-12-18 Ebay Inc. Item model based on descriptor and images
US11004260B2 (en) 2011-04-07 2021-05-11 Ebay Inc. Item model based on descriptor and images

Also Published As

Publication number Publication date
CA2832227C (en) 2016-12-06
CA2832227A1 (en) 2012-10-11
US20120259738A1 (en) 2012-10-11
CN110942370A (en) 2020-03-31
US11004260B2 (en) 2021-05-11
EP2695130A1 (en) 2014-02-12
US9805503B2 (en) 2017-10-31
KR101420041B1 (en) 2014-07-15
CN103548051B (en) 2019-12-17
CN103548051A (en) 2014-01-29
US20180040157A1 (en) 2018-02-08
US8473362B2 (en) 2013-06-25
AU2012240539B2 (en) 2015-07-30
US20130257868A1 (en) 2013-10-03
EP2695130A4 (en) 2015-02-11
US10157496B2 (en) 2018-12-18
CN110942370B (en) 2023-05-12
US20190088010A1 (en) 2019-03-21
AU2012240539A1 (en) 2013-10-31
KR20130010027A (en) 2013-01-24

Similar Documents

Publication Publication Date Title
US11004260B2 (en) Item model based on descriptor and images
US10628877B2 (en) System and method for visualization of items in an environment using augmented reality
US11768870B2 (en) Identifying product metadata from an item image
US11507970B2 (en) Dynamically generating a reduced item price
US11182846B2 (en) Providing an image of an item to advertise the item
AU2015249179B2 (en) Item model based on descriptor and images
US20160189219A1 (en) Simplified overlay ads

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12767928

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20127032750

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2832227

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2012767928

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2012240539

Country of ref document: AU

Date of ref document: 20120312

Kind code of ref document: A