WO2005089265A2 - Personalized prototyping - Google Patents

Personalized prototyping

Info

Publication number
WO2005089265A2
WO2005089265A2 PCT/US2005/008407
Authority
WO
WIPO (PCT)
Prior art keywords
data set
input data
representing
item
dimensional
Prior art date
Application number
PCT/US2005/008407
Other languages
French (fr)
Other versions
WO2005089265A3 (en)
Inventor
Bran Ferren
Edward K. Y. Jung
Clarence T. Tegreene
Original Assignee
Searete Llc
Priority date
Filing date
Publication date
Priority claimed from US10/802,106 (US7806339B2)
Priority claimed from US10/884,760 (US20060004476A1)
Priority claimed from US10/892,974 (US10215562B2)
Priority claimed from US10/892,755 (US20060012081A1)
Application filed by Searete Llc
Publication of WO2005089265A2
Publication of WO2005089265A3

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/4202Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine preparation of the programme medium using a drawing, a model
    • G05B19/4205Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine preparation of the programme medium using a drawing, a model in which a drawing is traced or scanned and corresponding data recorded
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30Auxiliary operations or equipment
    • B29C64/386Data acquisition or data processing for additive manufacturing

Definitions

  • a method for producing an item data set representing the three-dimensional configuration of an item includes accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object; accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object; adapting one or both of the first data input set and the second data input set to be combined to obtain an item data set representing a three-dimensional configuration of the item; and combining the first input data set and the second input data set to obtain the item data set.
  • a method for producing an item data set representing the three-dimensional configuration of an item includes providing a first input data set from a first provider, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object; accepting the first input data set; providing a second input data set from a second provider, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object; accepting the second input data set; adapting the first data input set and the second data input set to be combined to obtain an item data set representing at least a three-dimensional configuration of the item; and combining the first input data set and the second input data set to obtain the item data set.
  • FIG. 1 is a flow chart depicting an embodiment of the subject matter of this application;
  • FIG. 2 is a flow chart depicting another embodiment of the subject matter of this application;
  • FIG. 3 is a flow chart depicting another embodiment;
  • FIG. 4 is a flow chart depicting another embodiment;
  • FIG. 5 is a flow chart depicting another embodiment;
  • FIG. 6 is a flow chart depicting another embodiment;
  • FIG. 7 is a flow chart depicting another embodiment;
  • FIG. 8 is a flow chart depicting another embodiment;
  • FIG. 9 is a flow chart depicting another embodiment;
  • FIG. 10 is a flow chart depicting another embodiment;
  • FIG. 11 is a flow chart depicting another embodiment;
  • FIG. 12 is a flow chart depicting another embodiment;
  • FIG. 13 is a flow chart depicting another embodiment;
  • FIG. 14 is a flow chart depicting another embodiment;
  • FIG. 15 is an isometric view of another embodiment; and
  • FIG. 16 is an isometric view of another embodiment.
  • FIG. 1 shows an embodiment of the subject matter of this application, a method for producing an item data set representing the three-dimensional configuration of an item, the method including accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object (step 100); accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object (step 102); adapting one or both of the first data input set and the second data input set to be combined to obtain an item data set representing a three-dimensional configuration of the item (step 104); and combining the first input data set and the second input data set to obtain the item data set (step 106).
  • Steps 100 and 102 may be performed in any order or simultaneously.
  • another embodiment includes steps 100, 102, 104 and 106, and further includes producing the item using the item data set (step 108).
  • This embodiment is not limited to any particular manufacturing process; the manufacturing processes include those generally known as rapid prototyping.
  • another embodiment includes steps 100, 102, 104 and 106, and the first three-dimensional object includes at least one portion of a human figure.
  • Still another embodiment includes steps 100, 102, 104 and 106, and further includes deriving the first input data set from at least one captured image of at least one portion of the human figure (step 109).
  • the captured image or images may be a photograph or photographs of any type, e.g., a digital photograph.
  • Yet another embodiment, illustrated in FIG. 4, includes steps 100, 102, 104 and 106, and further includes deriving the first input data set from a plurality of captured images, each captured image including information representing a respective region of at least one portion of the human figure (step 110).
  • These captured images may be photographs of any type, e.g., digital photographs.
  • FIG. 5 shows an embodiment in which the first input data set may be accepted from a memory, e.g., a non-volatile memory.
  • This embodiment may further include presenting a port adapted to receive the non-volatile memory (step 112) and, in addition, receiving the non-volatile memory (step 114).
  • the first input data set may be accepted by way of a communications link, e.g., a wireless communications link.
  • step 100 may be initiated when a source of the first input data set (e.g., a laptop computer) is placed within operational range of a wireless communications port operably coupled to the wireless communications link, e.g., a wireless "hot-spot" such as those increasingly common in facilities of all types, e.g., airports and restaurants (step 116).
  • FIG. 7 illustrates another embodiment in which the first input data set is received via the Internet (step 118) prior to the acceptance of the first input data set (step 100).
  • the first input data set is accepted via the Internet.
  • Another embodiment, shown in FIG. 8, including steps 100, 102, 106 and 108, further includes presenting a set of pre-selected objects to a user (step 120); in this embodiment, step 100 includes accepting a user selection from the set of pre-selected objects.
  • presenting a set of pre-selected objects may include an interactive process with a user.
  • a user interface may present a set of categories from which the user may elect one or more.
  • the user interface may then present a set of visual images, e.g., thumbnails or full-size images from which the user can select objects.
  • the first data directly represents the first three-dimensional configuration, e.g., the first data includes spatial coordinates of the contours of the first three-dimensional configuration.
  • the first data includes at least one parameter representing at least one attribute of the first three-dimensional configuration, the method further including deriving the first three-dimensional configuration from the at least one attribute.
  • a parameter may be a numerical representation of an attribute, where an attribute may be a dimension, a feature, or, more generally, some describable aspect of a three-dimensional object.
  • the first data may include at least a datum representing a storage location, e.g., in a memory, for such a parameter, and an embodiment further includes retrieving the at least one parameter from the storage location using the at least one datum (step 122) and deriving the first three-dimensional configuration from the at least one attribute (step 124).
  • one embodiment includes steps 100, 102, 104 and 106, and further includes incorporating at least a portion of the first input data set into a computer model (step 126).
  • a "computer model" is a collection of attributes as discussed above, storable in an electronic library, memory, or database, and including or capable of generating any interpolations required to make the attributes included in the computer model mutually compatible.
  • steps 100, 102, 104 and 106 may include one or more steps and features, described here for illustrative purposes in detail in conjunction with the first input data set or the first data, with regard to one or both of the second input data set or the second data.
  • FIG. 11 illustrates another embodiment including steps 100, 102, 106 and 108 which further includes accepting an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier (step 128).
  • Embedded identifiers are discussed in U.S. Patent Application 10/802,106, filed March 16, 2004.
  • As shown in FIG. 12, another embodiment including steps 100, 102, 106 and 108 further includes forming an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier (step 130).
  • steps 100, 102, 106 and 108 further includes forming from first identification data and second identification data an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier, where the first input data set includes the first identification data and the second input data set includes the second identification data (step 132).
  • FIG. 14 illustrates one embodiment, a method for producing an item data set representing the three-dimensional configuration of an item, which includes providing a first input data set from a first provider, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object (step 134); accepting the first input data set (step 136); providing a second input data set from a second provider, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object (step 138); accepting the second input data set (step 140); adapting the first data input set and the second data input set to be combined to obtain an item data set representing at least a three-dimensional configuration of the item (step 142); and combining the first input data set and the second input data set to obtain the item data set (step 144).
  • Steps 134 and 136, taken together, may be performed before, after, simultaneously with, or partially simultaneously with steps 138 and 140 taken together.
  • the first provider and the second provider may be the same person or entity, and either or both of the first provider and the second provider may be a customer of a business using the method or making the method available for use by a customer.
  • the first provider may be a customer who may be located remotely from the business, may be at the location of the business, or may be located at a designated input location, such as a kiosk.
  • a user interface may present to the user a representation of the three-dimensional configuration of the item and provide a set of options to the user. For example, the user may approve of the item or may elect to modify one or more portions of the item.
  • the user interface may then provide updated financial transaction and scheduling information to the user and may permit additional user input such as transaction approval.
  • steps on any of the parallel branches may be performed before, after, simultaneously with, or partially simultaneously with the steps on the other branch or branches.
  • steps 134, 136, 138, 140, 142 and 144 may include one or more of the steps and features discussed above in conjunction with steps 100, 102, 104 and 106.
  • Another embodiment, shown in FIG. 15, is an article of manufacture including at least one first feature 146 shaped according to a first data set, the first data set including at least three-dimensional configuration data for the at least one first feature, and at least one second feature 148 shaped according to a second data set, the second data set including at least three-dimensional configuration data for the at least one second feature.
  • In one embodiment, the first feature and the second feature are shaped using rapid prototyping.
  • Yet another embodiment, shown in FIG. 16, is an article of manufacture including at least one first feature 150 and at least one second feature 152, each shaped according to a data set provided by a respective provider; here too, the first feature and the second feature may be shaped using rapid prototyping.
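The accept/adapt/combine flow recited throughout these embodiments (steps 100, 102, 104 and 106) can be sketched as a minimal program. The vertex-list representation, the uniform-scale "adapting" step, and every name below are assumptions made for illustration; they are not part of the application.

```python
# A minimal sketch of steps 100-106: accept two three-dimensional data
# sets, adapt one so the two are combinable, and combine them into an
# item data set. The vertex-list representation and the uniform-scale
# adaptation are illustrative assumptions.

def bounding_height(vertices):
    """Extent of a vertex list along z, used to match the two objects."""
    zs = [z for _, _, z in vertices]
    return max(zs) - min(zs)

def adapt(vertices, target_height):
    """Step 104: uniformly rescale a data set so it can be combined."""
    scale = target_height / bounding_height(vertices)
    return [(x * scale, y * scale, z * scale) for x, y, z in vertices]

def combine(first_set, second_set):
    """Step 106: merge the adapted data sets into one item data set."""
    second_adapted = adapt(second_set, bounding_height(first_set))
    return first_set + second_adapted

# Steps 100 and 102: accept the two input data sets (toy examples).
first_input = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 2.0)]
second_input = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 1.0)]

item_data_set = combine(first_input, second_input)
```

Here the second object is rescaled to the first object's height before merging, a stand-in for whatever adaptation (scaling, alignment, format conversion) step 104 would perform in practice.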

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Materials Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method for producing an item data set representing the three-dimensional configuration of an item includes accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object (100); accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object (102); adapting one or both of the first data input set and the second data input set to be combined to obtain an item data set representing a three-dimensional configuration of the item (104); and combining the first input data set and the second input data set to obtain the item data set (106). Other methods for producing item data sets and articles of manufacture are also disclosed.

Description

PERSONALIZED PROTOTYPING
Inventors: Bran Ferren, Edward K.Y. Jung, Clarence T. Tegreene CROSS-REFERENCE TO RELATED APPLICATIONS The present application is related to, claims the earliest available effective filing date(s) from (e.g., claims earliest available priority dates for other than provisional patent applications; claims benefits under 35 USC § 119(e) for provisional patent applications), and incorporates by reference in its entirety all subject matter of the following listed application(s); the present application also claims the earliest available effective filing date(s) from, and also incorporates by reference in its entirety all subject matter of any and all parent, grandparent, great-grandparent, etc. applications of the following listed application(s):
1. United States patent application entitled EMBEDDED IDENTIFIERS, naming Bran Ferren, Edward K.Y. Jung and Clarence T. Tegreene as inventors, filed 16 March 2004 having U.S. Patent Application No. 10/802,106.
2. United States patent application entitled A SYSTEM FOR MAKING CUSTOM PROTOTYPES, naming Edward K.Y. Jung, Bran Ferren and Clarence T. Tegreene as inventors, filed 2 July 2004.
3. United States patent application entitled CUSTOM PROTOTYPING, naming Edward K.Y. Jung, Bran Ferren and Clarence T. Tegreene as inventors, filed contemporaneously herewith. FIELD OF THE INVENTION The subject matter of this application generally relates to creating custom data sets and custom-made articles of manufacture.
SUMMARY OF THE INVENTION One embodiment of the subject matter of this application, a method for producing an item data set representing the three-dimensional configuration of an item, includes accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object; accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object; adapting one or both of the first data input set and the second data input set to be combined to obtain an item data set representing a three-dimensional configuration of the item; and combining the first input data set and the second input data set to obtain the item data set. Another embodiment, a method for producing an item data set representing the three-dimensional configuration of an item, includes providing a first input data set from a first provider, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object; accepting the first input data set; providing a second input data set from a second provider, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object; accepting the second input data set; adapting the first data input set and the second data input set to be combined to obtain an item data set representing at least a three-dimensional configuration of the item; and combining the first input data set and the second input data set to obtain the item data set. The first provider and the second provider may be the same person or entity, and either or both of the first provider and the second provider may be a customer of a business using the method or making the method available for use by a customer.
Other embodiments are described in the detailed description of the figures. BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a flow chart depicting an embodiment of the subject matter of this application; FIG. 2 is a flow chart depicting another embodiment of the subject matter of this application; FIG. 3 is a flow chart depicting another embodiment; FIG. 4 is a flow chart depicting another embodiment; FIG. 5 is a flow chart depicting another embodiment; FIG. 6 is a flow chart depicting another embodiment; FIG. 7 is a flow chart depicting another embodiment; FIG. 8 is a flow chart depicting another embodiment; FIG. 9 is a flow chart depicting another embodiment; FIG. 10 is a flow chart depicting another embodiment; FIG. 11 is a flow chart depicting another embodiment; FIG. 12 is a flow chart depicting another embodiment; FIG. 13 is a flow chart depicting another embodiment; FIG. 14 is a flow chart depicting another embodiment; FIG. 15 is an isometric view of another embodiment; and FIG. 16 is an isometric view of another embodiment.
DETAILED DESCRIPTION OF THE FIGURES FIG. 1 shows an embodiment of the subject matter of this application, a method for producing an item data set representing the three-dimensional configuration of an item, the method including accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object (step 100); accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object (step 102); adapting one or both of the first data input set and the second data input set to be combined to obtain an item data set representing a three-dimensional configuration of the item (step 104); and combining the first input data set and the second input data set to obtain the item data set (step 106). Steps 100 and 102 may be performed in any order or simultaneously. As shown in FIG. 2, another embodiment includes steps 100, 102, 104 and 106, and further includes producing the item using the item data set (step 108). This embodiment is not limited to any particular manufacturing process; the manufacturing processes include those generally known as rapid prototyping. As shown in FIG. 3, another embodiment includes steps 100, 102, 104 and 106, and the first three-dimensional object includes at least one portion of a human figure. Still another embodiment includes steps 100, 102, 104 and 106, and further includes deriving the first input data set from at least one captured image of at least one portion of the human figure (step 109). The captured image or images may be a photograph or photographs of any type, e.g., a digital photograph. Yet another embodiment, illustrated in FIG. 
4, includes steps 100, 102, 104 and 106, and further includes deriving the first input data set from a plurality of captured images, each captured image including information representing a respective region of at least one portion of the human figure (step 110). These captured images may be photographs of any type, e.g., digital photographs. FIG. 5 shows an embodiment in which the first input data set may be accepted from a memory, e.g., a non-volatile memory. This embodiment may further include presenting a port adapted to receive the non-volatile memory (step 112) and, in addition, receiving the non-volatile memory (step 114). In this embodiment, the first input data set may be accepted by way of a communications link, e.g., a wireless communications link. Where the first input data set is accepted by way of a wireless communications link, as shown in FIG. 6, step 100 may be initiated when a source of the first input data set (e.g., a laptop computer) is placed within operational range of a wireless communications port operably coupled to the wireless communications link, e.g., a wireless "hot-spot" such as those increasingly common in facilities of all types, e.g., airports and restaurants (step 116). FIG. 7 illustrates another embodiment in which the first input data set is received via the Internet (step 118) prior to the acceptance of the first input data set (step 100). In yet another embodiment, the first input data set is accepted via the Internet. Another embodiment, shown in FIG. 8, including steps 100, 102, 106 and 108 further includes presenting a set of pre-selected objects to a user (step 120), and in this embodiment, step 100 includes accepting a user selection from the set of pre-selected objects. In one approach, presenting a set of pre-selected objects may include an interactive process with a user. For example, a user interface may present a set of categories from which the user may elect one or more.
The user interface may then present a set of visual images, e.g., thumbnails or full-size images from which the user can select objects.
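That interactive selection flow (step 120 feeding step 100) might be sketched as below; the catalog contents, category names, and all functions are invented for illustration.

```python
# Hypothetical sketch of step 120 (present pre-selected objects) and
# step 100 (accept a user selection). The catalog is invented.

CATALOG = {
    "figures": ["astronaut", "dancer"],
    "vehicles": ["roadster", "biplane"],
}

def present_categories():
    """Present the set of categories the user may elect from."""
    return sorted(CATALOG)

def present_objects(category):
    """Thumbnails or full-size images would be shown here; this sketch
    just returns the object names in the elected category."""
    return CATALOG[category]

def accept_selection(category, choice):
    """Step 100: accept the user's pick from the pre-selected set."""
    if choice not in CATALOG[category]:
        raise ValueError(f"{choice!r} is not in category {category!r}")
    return {"category": category, "object": choice}

selection = accept_selection("vehicles", "biplane")
```

The two-stage flow (categories first, then images within a category) mirrors the interactive process described above.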
In another embodiment, at least a portion of the first data directly represents the first three-dimensional configuration, e.g., the first data includes spatial coordinates of the contours of the first three-dimensional configuration. In yet another embodiment, illustrated in FIG. 9, the first data includes at least one parameter representing at least one attribute of the first three-dimensional configuration, and further including deriving the first three-dimensional configuration from the at least one attribute. In this embodiment, a parameter may be a numerical representation of an attribute, where an attribute may be a dimension, a feature, or, more generally, some describable aspect of a three-dimensional object. The first data may include at least a datum representing a storage location, e.g., in a memory, for such a parameter, and an embodiment further includes retrieving the at least one parameter from the storage location using the at least one datum (step 122) and deriving the first three-dimensional configuration from the at least one attribute (step 124). As shown in FIG. 10, one embodiment includes steps 100, 102, 104 and 106, and further includes incorporating at least a portion of the first input data set into a computer model (step 126). Herein, a "computer model" is a collection of attributes as discussed above, storable in an electronic library, memory, or database, and including or capable of generating any interpolations required to make mutually compatible the attributes included in the computer model. A variety of computer models are known to skilled artisans. Various embodiments including steps 100, 102, 104 and 106 may include one or more steps and features, described here for illustrative purposes in detail in conjunction with the first input data set or the first data, with regard to one or both of the second input data set or the second data. FIG. 
11 illustrates another embodiment including steps 100, 102, 106 and 108 which further includes accepting an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier (step 128). Embedded identifiers are discussed in U.S. Patent Application 10/802,106, filed March 16, 2004. As shown in FIG. 12, yet another embodiment including steps 100, 102, 106 and 108 further includes forming an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier (step 130). As illustrated in FIG. 13, still another embodiment including steps 100, 102, 106 and 108 further includes forming from first identification data and second identification data an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier, where the first input data set includes the first identification data and the second input data set includes the second identification data (step 132). FIG. 
14 illustrates one embodiment, a method for producing an item data set representing the three-dimensional configuration of an item, which includes providing a first input data set from a first provider, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object (step 134); accepting the first input data set (step 136); providing a second input data set from a second provider, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object (step 138); accepting the second input data set (step 140); adapting the first data input set and the second data input set to be combined to obtain an item data set representing at least a three-dimensional configuration of the item (step 142); and combining the first input data set and the second input data set to obtain the item data set (step 144). Steps 134 and 136, taken together, may be performed before, after, simultaneously with, or partially simultaneously with steps 138 and 140 taken together. The first provider and the second provider may be the same person or entity, and either or both of the first provider and the second provider may be a customer of a business using the method or making the method available for use by a customer. In one example, the first provider may be a customer who may be located remotely from the business, may be at the location of the business, or may be located at a designated input location, such as a kiosk. In one alternative, a user interface may present to the user a representation of the three-dimensional configuration of the item and provide a set of options to the user. For example, the user may approve of the item or may elect to modify one or more portions of the item. 
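The approve-or-modify options offered to the user might be handled as in the following sketch; the surcharge and lead-time arithmetic, the field names, and every function here are invented assumptions, not anything the application specifies.

```python
# Hypothetical sketch of the review step: offer approve/modify options
# and, on a modification, return updated price and scheduling figures.
# All names and numbers are invented for illustration.

def review_item(item, choice, modifications=None):
    """Return updated transaction and scheduling info for the user."""
    if choice == "approve":
        return {"status": "awaiting payment approval", "price": item["price"]}
    if choice == "modify":
        # Assume each modified portion adds a flat surcharge and one
        # day of lead time.
        n = len(modifications or [])
        return {
            "status": "quote updated",
            "price": item["price"] + 5.0 * n,
            "lead_time_days": item["lead_time_days"] + n,
        }
    raise ValueError(f"unknown choice: {choice!r}")

item = {"price": 40.0, "lead_time_days": 3}
quote = review_item(item, "modify", modifications=["left arm", "base"])
```

After a modification, the updated quote would be presented back to the user for transaction approval, matching the flow described in the text.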
If the user elects to modify the item, the user interface may then provide updated financial transaction and scheduling information to the user and may permit additional user input such as transaction approval. Generally, in the figures herein, where two or more parallel branches of a flow chart are shown, the steps on any of the parallel branches may be performed before, after, simultaneously with, or partially simultaneously with the steps on the other branch or branches. Various embodiments including steps 134, 136, 138, 140, 142 and 144 may include one or more of the steps and features discussed above in conjunction with steps 100, 102, 104 and 106. Another embodiment, shown in FIG. 15, is an article of manufacture including at least one first feature 146 shaped according to a first data set, the first data set including at least three-dimensional configuration data for the at least one first feature and at least one second feature 148 shaped according to a second data set, the second data set including at least three-dimensional configuration data for the at least one second feature, where one or both of the first data set and the second data set are adapted to be combined with each other to yield the at least one first feature and the at least one second feature as integral portions of the article of manufacture. In one embodiment, the first feature and the second feature are shaped using rapid prototyping. Yet another embodiment, shown in FIG. 
16, is an article of manufacture including at least one first feature 150 shaped according to a first data set, the first data set being provided by a first provider and including at least a first three-dimensional configuration of the at least one first feature, and at least one second feature 152 shaped according to a second data set, the second data set being provided by a second provider and including at least a second three-dimensional configuration of the at least one second feature, where one or both of the first data set and the second data set are adapted to be combined with each other to yield the at least one first feature and the at least one second feature as integral portions of the article of manufacture. In one embodiment, the first feature and the second feature are shaped using rapid prototyping. One skilled in the art will recognize that the foregoing components (e.g., steps), devices, and objects in Figures 1-16 and the discussion accompanying them are used as examples for the sake of conceptual clarity and that various configuration modifications are common. Consequently, as used herein, the specific exemplars set forth in Figures 1-16 and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar herein is also intended to be representative of its class, and the non-inclusion of such specific components (e.g., steps), devices, and objects herein should not be taken as indicating that limitation is desired. While particular embodiments of the subject matter of this application have been shown and described, it will be obvious to those skilled in the art that, based upon the teaching herein, changes and modifications may be made without departing from the subject matter and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of the subject matter. 
Furthermore, it is to be understood that the subject matter of this application is solely defined by the appended claims. Other embodiments are within the following claims.
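As one hedged illustration of the embedded-identifier feature recited in the claims (forming an identifier input data set from the two providers' identification data), an identifier code might be derived by hashing the combined identification data into a short code to be rendered as a raised three-dimensional marking on the item. The field names, geometry parameters, and hashing choice below are assumptions made for illustration only:

```python
import hashlib

def form_identifier_input_data_set(first_id, second_id):
    """Form an identifier input data set from two providers' identification
    data. The SHA-256 digest is truncated to an eight-character code that a
    downstream step could emboss on the item as raised glyphs."""
    digest = hashlib.sha256(f"{first_id}|{second_id}".encode()).hexdigest()
    return {
        "identifier": digest[:8].upper(),  # short, human-readable code
        "geometry": "raised_glyphs",       # how the code is rendered in 3-D
        "depth_mm": 0.5,                   # relief depth of the marking
    }

id_set = form_identifier_input_data_set("provider-A", "customer-42")
# The same pair of identification inputs always yields the same identifier,
# so the marking can later be matched back to the originating data sets.
```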

Claims

We claim:

1. A method for producing an item data set representing the three-dimensional configuration of an item, the method comprising:
accepting a first input data set, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object;
accepting a second input data set, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object;
adapting one or both of the first input data set and the second input data set to be combined to obtain an item data set representing a three-dimensional configuration of the item; and
combining the first input data set and the second input data set to obtain the item data set.
2. A method for producing an item, the method comprising: the method of claim 1 and further comprising producing the item using the item data set.
3. The method of claim 2, wherein the producing the item uses rapid prototyping.
4. The method of claim 1, wherein the first three-dimensional object includes at least one portion of a human figure.
5. The method of claim 4, further comprising deriving the first input data set from at least one captured image of the at least one portion of the human figure.
6. The method of claim 5, wherein the at least one captured image is a photograph.
7. The method of claim 5, wherein the at least one captured image is a digital photograph.
8. The method of claim 5, further comprising deriving the first input data set from a plurality of captured images, each captured image including information representing a respective region of the at least one portion of the human figure.
9. The method of claim 1, wherein the first input data set is accepted from a memory.
10. The method of claim 9, wherein the first input data set is accepted from a nonvolatile memory.
11. The method of claim 10, further comprising presenting a port adapted to receive the non-volatile memory.
12. The method of claim 11, further comprising receiving the non-volatile memory.
13. The method of claim 1, wherein the first input data set is accepted by way of a communications link.
14. The method of claim 13, wherein the communications link is a wireless communications link.
15. The method of claim 14, further comprising initiating the accepting the first input data set when a source of the first input data set is placed within operational range of a wireless communications port operably coupled to the wireless communications link.
16. The method of claim 1, wherein at least a portion of the first data directly represents the first three-dimensional configuration.
17. The method of claim 1, wherein at least a portion of the first data includes at least one parameter representing at least one attribute of the first three-dimensional configuration and further comprising deriving at least a portion of the first three-dimensional configuration from the at least one attribute.
18. The method of claim 1, wherein at least a portion of the first data includes at least one datum representing a storage location for at least one parameter representing at least one attribute of the first three-dimensional configuration and further comprising retrieving the at least one parameter from the storage location using the at least one datum.
19. The method of claim 18, further comprising deriving at least a portion of the first three-dimensional configuration from the at least one attribute.
20. The method of claim 1, further comprising incorporating at least a portion of the first input data set into a computer model.
21. The method of claim 1, wherein the first input data set is received via the Internet prior to the accepting the first input data set.
22. The method of claim 1, wherein the first input data set is accepted via the Internet.
23. The method of claim 1, further comprising presenting a set of pre-selected objects to a user; and wherein accepting a first input data set includes accepting a user selection from the set of pre-selected objects.
24. The method of claim 1, further comprising accepting an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
25. The method of claim 1, further comprising forming an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
26. The method of claim 1, wherein the first input data set includes first identification data and the second input data set includes second identification data, the method further comprising forming from the first identification data and the second identification data an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
27. A method for producing an item data set representing the three-dimensional configuration of an item, the method comprising:
providing a first input data set from a first provider, the first input data set including first data representing at least a first three-dimensional configuration of a first three-dimensional object;
accepting the first input data set;
providing a second input data set from a second provider, the second input data set including second data representing at least a second three-dimensional configuration of a second three-dimensional object;
accepting the second input data set;
adapting the first input data set and the second input data set to be combined to obtain an item data set representing at least a three-dimensional configuration of the item; and
combining the first input data set and the second input data set to obtain the item data set.
28. A method for producing an item, the method comprising: the method of claim 27 and further comprising using rapid prototyping to produce the item using the item data set.
29. The method of claim 27, wherein the first three-dimensional object includes at least one portion of a human figure, and further comprising deriving the first input data set from a plurality of digital photographs, each digital photograph including information representing a respective region of the at least one portion of the human figure.
30. The method of claim 27, further comprising presenting a port adapted to receive the non-volatile memory and receiving the non-volatile memory, wherein the first input data set is accepted from a non-volatile memory.
31. The method of claim 27, further comprising initiating the accepting of the first input data set when a source of the first input data set is placed within operational range of a wireless communications port operably coupled to a wireless communications link, wherein the first input data set is accepted by way of the wireless communications link.
32. The method of claim 27, wherein at least a portion of the first data directly represents the first three-dimensional configuration.
33. The method of claim 27, wherein at least a portion of the first data includes at least one parameter representing at least one attribute of the first three-dimensional configuration and further comprising deriving at least a portion of the first three-dimensional configuration from the at least one attribute.
34. The method of claim 27, wherein at least a portion of the first data includes at least one datum representing a storage location for at least one parameter representing at least one attribute of the first three-dimensional configuration and further comprising retrieving the at least one parameter from the storage location using the at least one datum and deriving at least a portion of the first three-dimensional configuration from the at least one attribute.
35. The method of claim 27, further comprising incorporating the first input data set into a computer model.
36. The method of claim 27, wherein the first input data set is received via the Internet prior to the accepting the first input data set.
37. The method of claim 27, wherein the first input data set is accepted via the Internet.
38. The method of claim 27, further comprising presenting a set of pre-selected objects to a user; and wherein accepting a first input data set includes accepting a user selection from the set of pre-selected objects.
39. The method of claim 27, further comprising accepting an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
40. The method of claim 27, further comprising forming an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
41. The method of claim 27, wherein the first input data set includes first identification data and the second input data set includes second identification data, the method further comprising forming from the first identification data and the second identification data an identifier input data set including data representing at least one three-dimensional identifier configuration corresponding to an embedded identifier.
42. An article of manufacture comprising:
at least one first feature shaped according to a first data set, the first data set including at least three-dimensional configuration data for the at least one first feature; and
at least one second feature shaped according to a second data set, the second data set including at least three-dimensional configuration data for the at least one second feature;
wherein one or both of the first data set and the second data set are adapted to be combined with each other to yield the at least one first feature and the at least one second feature as integral portions of the article of manufacture.
43. The article of manufacture of claim 42, wherein the first feature and the second feature are shaped using rapid prototyping.
44. An article of manufacture comprising:
at least one first feature shaped according to a first data set, the first data set being provided by a first provider and including at least a first three-dimensional configuration of the at least one first feature; and
at least one second feature shaped according to a second data set, the second data set being provided by a second provider and including at least a second three-dimensional configuration of the at least one second feature;
wherein one or both of the first data set and the second data set are adapted to be combined with each other to yield the at least one first feature and the at least one second feature as integral portions of the article of manufacture.
45. The article of manufacture of claim 44, wherein the first feature and the second feature are shaped using rapid prototyping.
PCT/US2005/008407 2004-03-16 2005-03-14 Personalized prototyping WO2005089265A2 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US10/802,106 2004-03-16
US10/802,106 US7806339B2 (en) 2004-03-16 2004-03-16 Embedded identifiers
US10/884,760 2004-07-02
US10/884,760 US20060004476A1 (en) 2004-07-02 2004-07-02 System for making custom prototypes
US10/892,974 US10215562B2 (en) 2004-07-16 2004-07-16 Personalized prototyping
US10/892,974 2004-07-16
US10/892,755 US20060012081A1 (en) 2004-07-16 2004-07-16 Custom prototyping
US10/892,755 2004-07-16

Publications (2)

Publication Number Publication Date
WO2005089265A2 true WO2005089265A2 (en) 2005-09-29
WO2005089265A3 WO2005089265A3 (en) 2009-04-16

Family

ID=34994229

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2005/008531 WO2005089307A2 (en) 2004-03-16 2005-03-14 Custom prototyping
PCT/US2005/008407 WO2005089265A2 (en) 2004-03-16 2005-03-14 Personalized prototyping

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2005/008531 WO2005089307A2 (en) 2004-03-16 2005-03-14 Custom prototyping

Country Status (1)

Country Link
WO (2) WO2005089307A2 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020012454A1 (en) * 2000-03-09 2002-01-31 Zicheng Liu Rapid computer modeling of faces for animation
US6405095B1 (en) * 1999-05-25 2002-06-11 Nanotek Instruments, Inc. Rapid prototyping and tooling system
US6623681B1 (en) * 1997-05-20 2003-09-23 Toray Industries, Inc. Polyester fiber and process for preparing the same
US6623687B1 (en) * 1999-08-06 2003-09-23 Milwaukee School Of Engineering Process of making a three-dimensional object


Also Published As

Publication number Publication date
WO2005089307A3 (en) 2009-04-02
WO2005089307A2 (en) 2005-09-29
WO2005089265A3 (en) 2009-04-16

Similar Documents

Publication Publication Date Title
US20180165730A1 (en) Made-to-order direct digital manufacturing enterprise
US20230334545A1 (en) Systems and methods for creating 3d objects
US7656402B2 (en) Method for creating, manufacturing, and distributing three-dimensional models
US9928544B1 (en) Vehicle component installation preview image generation
JP6041958B2 (en) Improved user interface for object design
KR20060050970A (en) Product design method, product design apparatus, product design system, and product design program
CN102158628B (en) Album creating apparatus and album creating method
CN106991723A (en) Interactive house browsing method and system of three-dimensional virtual reality
US20120310768A1 (en) Order Fulfillment and Content Management Systems and Methods
CN108454114A (en) A kind of customization platform and its method for customizing for 3D printing
US10586262B2 (en) Automated system and method for the customization of fashion items
US10286605B1 (en) Identifiable information for three-dimensional items
US9639924B2 (en) Adding objects to digital photographs
CN103065248B (en) A kind of network menu self-help design method and system thereof
CN102209973B (en) Method and system for facilities management
US20060012081A1 (en) Custom prototyping
US20060031252A1 (en) Personalized prototyping
JP2023550884A (en) Systems and methods for automatically configuring custom product selections based on user interaction
KR101748245B1 (en) Method for providing 3d printing data service
WO2005089265A2 (en) Personalized prototyping
JP2004054363A (en) Electronic catalog device for mold part
US7894924B2 (en) System and method for internet based automated memorial design and manufacturing
US20180121995A1 (en) Systems and methods for managing three dimensional manufacturing
US20180116349A1 (en) Modular image display
USRE46807E1 (en) Made to order digital manufacturing enterprise

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase