US8115090B2 - Mashup data file, mashup apparatus, and content creation method - Google Patents

Mashup data file, mashup apparatus, and content creation method

Info

Publication number
US8115090B2
Authority
US
United States
Prior art keywords
piece, mashup, content, data, pieces
Legal status
Expired - Fee Related
Application number
US12/312,947
Other versions
US20100064882A1 (en)
Inventor
Yasushi Miyajima
Yoichiro Sako
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Application filed by Sony Corp
Assigned to SONY CORPORATION (assignors: SAKO, YOICHIRO; MIYAJIMA, YASUSHI)
Publication of US20100064882A1
Application granted
Publication of US8115090B2
Status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00: Details of electrophonic musical instruments
    • G10H 1/0008: Associated control or indicating means
    • G10H 1/0025: Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • G10H 1/36: Accompaniment arrangements
    • G10H 1/38: Chord
    • G10H 2210/00: Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H 2210/101: Music composition or musical creation; tools or processes therefor
    • G10H 2210/125: Medley, i.e. linking parts of different musical pieces in one single piece, e.g. sound collage, DJ mix
    • G10H 2240/00: Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/121: Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
    • G10H 2240/131: Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
    • G10H 2240/135: Library retrieval index, i.e. using an indexing scheme to efficiently retrieve a music piece
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034: Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G11B 27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34: Indicating arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Theoretical Computer Science (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Reverberation, Karaoke And Other Acoustics (AREA)

Abstract

There are provided recording means for recording first data (12A or 12B) used for dividing each of a first piece of content and a second piece of content into a plurality of blocks in accordance with each piece of content, and recording means (11A or 11B) for recording second data indicating a sequence for arranging the plurality of blocks to create a new piece of content. The first piece of content and the second piece of content are mashed up using the first data and the second data, and a result of the mashup processing is output. As a result, it is possible to mash up the first piece of content and the second piece of content without special knowledge.

Description

CROSS-REFERENCES TO RELATED APPLICATIONS
The present application is a national phase entry under 35 U.S.C. §371 of International Application No. PCT/JP2007/069680 filed Oct. 9, 2007, published on Jun. 5, 2008 as WO 2008/065808 A1, which claims priority from Japanese Patent Application No. JP 2006-319641 filed in the Japanese Patent Office on Nov. 28, 2006.
TECHNICAL FIELD
The present invention relates to a mashup data file, a mashup apparatus, and a content creation method.
BACKGROUND ART
Processing for overlaying parts or all of a plurality of music pieces is called remixing. One type of remixing is "mashup" processing, in which more sophisticated processing is performed. In mashup processing, for example, when a music piece A and a music piece B are remixed, a new music piece is created by making the tempos (the speeds at which the music pieces are played), the numbers of beats (for example, the numbers of quarter notes), the keys (the pitches of the essential notes, such as C major or D minor), and the chord progressions (the way the chords change) of the two music pieces conform to each other, and by cutting and pasting only appropriate portions of these music pieces.
A music piece created through mashup processing is sometimes enjoyed not only by the person who created it but also by that person's friends.
Related art documents include, for example, Japanese Unexamined Patent Application Publication No. 2006-107693.
DISCLOSURE OF INVENTION
However, mashup processing can be performed only by users who are very familiar with music.
Furthermore, a new music piece created by mashing up original music pieces may infringe the copyrights of the original music pieces.
The present invention addresses the above-described problems.
The present invention provides a mashup data file used for mashing up at least a first piece of content and a second piece of content. The mashup data file includes: first data used for dividing each of the first piece of content and the second piece of content into a plurality of blocks in accordance with each piece of content; and second data indicating a sequence for arranging the plurality of blocks to create a new piece of content.
According to the present invention, it is possible to perform mashup processing without special knowledge of mashup processing. Furthermore, it is possible to reprocess or develop a result of the mashup processing.
Furthermore, when the mashup data file (hereinafter also referred to as a "recipe") is distributed, the users who will listen to the mashup music piece, which is the result of mashup processing, prepare the original music pieces themselves. The prepared original music pieces are simply reproduced in accordance with the recipe. Accordingly, the possibility that the infringement of the copyrights of the original music pieces will occur is low.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a diagram illustrating a display screen according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating a display screen according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a display screen according to an embodiment of the present invention.
FIG. 4 is an enlarged view of a display example.
FIG. 5 is a diagram illustrating an application example.
FIG. 6 is a diagram illustrating an example of use.
FIG. 7 is a diagram illustrating an example of a concept of data.
FIG. 8 is a diagram illustrating an example of a data table and exemplary contents of the data table.
FIG. 9 is a diagram illustrating an example of a personal computer applying the present invention.
FIG. 10 is a diagram illustrating an example of data provided in a storage.
FIG. 11 is a flowchart illustrating an example of a mashup processing method.
FIG. 12 is a flowchart illustrating another example of a mashup processing method.
FIG. 13 is a schematic diagram illustrating an example of a preprocessing method.
FIG. 14 is a waveform diagram illustrating an example of a preprocessing method.
BEST MODES FOR CARRYING OUT THE INVENTION
[1] Overview of Display Example
FIGS. 1 to 3 illustrate display examples of a display screen in a case where mashup processing according to the present invention is performed. In these examples, a personal computer performs mashup processing so as to create a new music piece from two music pieces A and B. However, the mashup processing may be performed using three or more music pieces. Sections into which a music piece can be separated, such as an introduction (intro), a verse A, a verse B, a chorus, a bridge, and an ending (outro), are hereinafter referred to as "blocks".
In FIGS. 1 to 3, a reference numeral 10 represents a display screen when mashup processing is performed. FIG. 1 illustrates a screen when mashup processing is started, FIG. 2 illustrates a screen while the mashup processing is being performed, and FIG. 3 illustrates a screen when the mashup processing has finished.
The display screen 10 is divided into two areas, an upper area 11 and a lower area 12. In the upper area 11, information about the result of mashup processing is displayed. In the lower area 12, pieces of information about the original music pieces A and B to be used for mashup processing are displayed.
Accordingly, the upper area 11 is provided with a strip-shaped area 11A to which blocks of the original music piece A are to be copied and a strip-shaped area 11B to which blocks of the original music piece B are to be copied. The areas 11A and 11B are hereinafter referred to as tracks 11A and 11B, respectively, in a manner similar to tracks of a magnetic tape. The tracks 11A and 11B are arranged in the area 11 so that they are parallel to each other, with the horizontal direction serving as the time axis.
Furthermore, at the bottom of the track 11B, a performance time scale 11S is displayed. At the starting position of the performance time scale 11S, a pointer 11P indicating a performance time position is displayed.
Still furthermore, a strip-shaped track 12A indicating information about the original music piece A and a strip-shaped track 12B indicating information about the original music piece B are formed in the area 12 so that they are parallel to each other, with the horizontal direction serving as the time axis. FIG. 1 illustrates an initial screen on which the pieces of information about the original music pieces A and B have already been provided in the tracks 12A and 12B, respectively (when these pieces of information have not yet been provided, the tracks are empty like the tracks 11A and 11B).
In this case, the track 12A is divided into two tracks, an upper track 12AM called the "melody track" and a lower track 12AC called the "chord track". The melody track 12AM is separated into the blocks of the original music piece A, that is, an introduction, a verse A, a verse B, a chorus, a bridge, an ending (outro), and so forth, and each of these blocks is provided with data of the corresponding melody. Furthermore, as illustrated in FIG. 4, in which the time axis (the horizontal direction) is enlarged, the chord track 12AC is separated into blocks in accordance with the chord progression of the original music piece A, and each of these blocks is provided with data of the corresponding chord.
Still furthermore, the track 12B has the same structure as that of the track 12A. A melody track 12BM is separated into the blocks of the original music piece B, and each of these blocks is provided with data of the corresponding melody. A chord track 12BC is separated into blocks in accordance with the chord progression of the original music piece B, and each of these blocks is provided with data of the corresponding chord.
Furthermore, as indicated by reference numerals 13A and 13B, the artist names, titles, tempos, and numbers of bars of the original music pieces A and B are displayed above the tracks 12A and 12B, respectively. If there are three or more original music pieces, the tracks of all of the original music pieces (the tracks 12A, 12B, and so on) can be accessed by using the mouse to operate a scroll button 14S or a scroll bar 14V displayed on the right-hand side of the area 12, scrolling the area 12 vertically.
Incidentally, even if the time length of the tracks 12AM and 12AC differs from that of the tracks 12BM and 12BC in reality, they are displayed as having the same standardized time length. Each block included in these tracks is displayed with a color corresponding to contents of the block. Furthermore, the pieces of digital audio data of the original music pieces A and B and the pieces of information to be displayed in the tracks 12A and 12B are obtained, for example, via a download site, the Internet, or the like, and are then provided on a personal computer in advance.
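The pieces of information shown in the tracks 12A and 12B come from the metadata prepared for each original music piece: sections such as an intro, verse, and chorus on the melody track, and chord blocks on the chord track. As a rough illustration only, the following Python sketch shows one way such per-piece metadata could be organized; the class and field names are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class MelodySection:
    """One block of the melody track (e.g., intro, verse A, chorus, outro)."""
    name: str          # e.g. "intro", "verse A", "chorus"
    start_sample: int  # start position in the original audio, in samples
    end_sample: int    # end position in the original audio, in samples


@dataclass
class ChordBlock:
    """One block of the chord track, following the chord progression."""
    chord: str         # e.g. "C", "Am7"
    start_sample: int
    end_sample: int


@dataclass
class SongMetadata:
    """Hypothetical per-piece metadata used to populate the tracks 12A and 12B."""
    music_piece_id: str
    title: str
    artist: str
    tempo_bpm: float               # shown next to the track (reference numerals 13A, 13B)
    key: str                       # e.g. "C major"
    meter: str                     # e.g. "4/4"
    melody_sections: List[MelodySection]
    chord_blocks: List[ChordBlock]
```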
Furthermore, at the top of the display screen 10, a reproduction button 15P used to perform test-listening of the progress or result of mashup processing, a stop button 15S used to stop the reproduction of the progress or result of mashup processing, an input field 16 used to specify a reproduction tempo, a save button 17 used to store information about a result of mashup processing, etc. are displayed.
[2] Mashup Processing Method
In the display state illustrated in FIG. 1, if, for example, the "verse A" included in the melody track 12AM of the original music piece A is dragged and dropped or copied and pasted at the beginning of the track 11A as indicated by an arrow A1 in FIG. 2, the "verse A" of the original music piece A is copied to the beginning of the track 11A. Referring to FIG. 2, as indicated by an arrow A2, the "verse A" is then repeatedly copied to the track 11A.
Similarly, if, as indicated by an arrow B1 in FIG. 2, the "verse A" included in the melody track 12BM of the original music piece B is dragged and dropped or copied and pasted at the beginning of the track 11B, the "verse A" of the original music piece B is copied to the beginning of the track 11B.
Subsequently, by performing operations similar to those described above, predetermined blocks included in any of the tracks 12AM to 12BC are copied to the track 11A or 11B. As a result, for example, the tracks 11A and 11B can have data as illustrated in FIG. 3.
Subsequently, if a user clicks on the reproduction button 15P, a music piece created in the track 11A and a music piece created in the track 11B are mixed and are then reproduced. Accordingly, the user can test-listen to a mashup music piece.
Note that, in the reproduction of the music piece that is a mashup processing result, the digital audio data itself of the mashup processing result is not reproduced. The blocks of the original music pieces A and B are selectively extracted or processed in real time on the basis of the pieces of data displayed in the tracks 11A and 11B and are then reproduced.
If the user is not satisfied with the mashup music piece, the user can delete, copy, or move a corresponding block included in the track 11A or 11B using a mouse and also can copy a new block from any one of the tracks 12AM to 12BC to the track 11A or 11B. That is, the tracks 11A and 11B can be individually edited on a block-by-block basis.
Thus, by copying blocks from the tracks 12AM to 12BC to the tracks 11A and 11B, editing the tracks 11A and 11B, and test-listening to the result of these operations, the user can obtain a desired mashup music piece as the result of mashup processing.
If the user clicks on the save button 17 after the creation of a new mashup music piece has been completed, instead of the digital audio data itself of the mashup music piece that is a result of mashup processing, information required to reproduce the mashup music piece (the pieces of data displayed in the tracks 11A and 11B) is stored in a large-capacity storage included in a personal computer, for example, a hard disk drive. That is, information indicating the original music piece A, information indicating the original music piece B, pieces of information about blocks of the original music pieces A and B to be used for mashup processing, pieces of information about the time positions and chronological sequences of the blocks at the time of use of the blocks, information about the tempos of the blocks, etc. are stored in the hard disk drive as a file.
Incidentally, in a case where a meal is cooked, food items are prepared in accordance with a recipe and are then cooked using a procedure and a method which are indicated by the recipe. The above-described mashup processing method is similar to cooking. Accordingly, information required for mashup processing (in the above-described case, information stored in the hard disk drive) is hereinafter referred to as a “recipe”. That is, pieces of digital audio data of music pieces and pieces of metadata of the music pieces are prepared in accordance with a recipe, and are then edited using a procedure and a method which are indicated by the recipe. Consequently, a mashup music piece is created.
[3] Reproduction of Mashup Music Piece
In this case, the pieces of digital audio data of the target original music pieces A and B and a recipe file are prepared in a personal computer. If a program for the recipe is executed, the original music pieces A and B are automatically processed in accordance with the recipe, that is, on the basis of the tracks 11A and 11B illustrated in FIG. 3, and a mashup music piece is output as sound.
Accordingly, for example, as illustrated in FIG. 5, if this recipe is distributed among a plurality of n players (users) via a P2P network or is distributed from a server, a homepage, or the like to the n players (users) via a network, the users that have received the recipe can listen to a mashup music piece. In that case, the users that will listen to the mashup music piece prepare the original music pieces A and B. The prepared original music pieces A and B are simply reproduced in accordance with the received recipe. Accordingly, the possibility that the infringement of copyrights of the original music pieces A and B will occur is low. Conversely, this can contribute to the sales of the original music pieces A and B.
For example, as illustrated in FIG. 6, if mashup processing is performed using current hit music pieces 21A to 21C and a past music piece 21D, users can enjoy a new mashup music piece. Furthermore, according to this mashup system, the past music piece whose sales remain at a low level can attract attention again. Thus, the mashup system can contribute to sales.
Furthermore, special knowledge is not required for mashup processing. Still furthermore, a user can modify or develop a result of the mashup processing for the user's own use by processing the recipe.
[4] Data Structure in Recipe
FIG. 7 is a diagram illustrating the concept of the tracks 11A and 11B illustrated in FIG. 3. The track 11A includes a plurality of blocks, BLK_A1, BLK_A2, BLK_A3, etc. obtained from mashup processing. The track 11B includes a plurality of blocks, BLK_B1, BLK_B2, BLK_B3, etc. obtained from mashup processing. As illustrated in FIG. 7, the lengths (time lengths) of these blocks differ from each other. The blocks are numbered in consecutive order, but it does not necessarily mean that the blocks are consecutive blocks in an original music piece.
The blocks included in the tracks 11A and 11B are specified in accordance with, for example, a recipe illustrated in FIG. 8. That is, as illustrated in FIG. 8A, a track table TRKTBL is prepared. The track table TRKTBL includes a data area #NUM indicating the number of tracks, such as the tracks 11A and 11B, included in the upper area 11 of the display screen 10, a data area #A indicating pieces of information about the blocks in the track 11A, and a data area #B indicating pieces of information about the blocks in the track 11B.
In the case of the examples illustrated in FIGS. 1 to 3, the number of tracks indicated in the data area #NUM is two. Accordingly, the track table TRKTBL includes the data area #A for the first track 11A and the data area #B for the second track 11B.
The data area #A for the first track 11A includes a data field #A0 indicating the number of blocks included in the first track 11A and a data field #Ai indicating information about a block BLK_Ai (i=1, 2, . . . ) included in the first track 11A. Each of the data fields #A0 and #Ai stores corresponding pieces of data.
The data area #B for the second track 11B similarly includes a data field #B0 indicating the number of blocks included in the second track 11B and a data field #Bj indicating information about a block BLK_Bj (j=1, 2, . . . ) included in the second track 11B. Each of the data fields #B0 and #Bj stores corresponding pieces of data. If there are the third track and the subsequent tracks, a data area is similarly prepared for each track. The data area includes data fields each storing corresponding pieces of data.
For example, each of the data fields #Ai and #Bj in the data areas #A and #B stores pieces of data illustrated in FIG. 8B. That is, referring to FIG. 8B, pieces of data in the first to ninth lines are pieces of information about an original music piece to be used for mashup processing, and pieces of data in the remaining four lines are pieces of information about a mashup music piece that is a result of the mashup processing.
The “music piece ID” in the first line is an identification code used to identify an original music piece (in this case, the original music piece A or B). Furthermore, since the tempo of a music piece generally varies on a block-by-block basis, the “original tempo of block” indicates the tempo of a corresponding block of the original music piece and the “meter of block” indicates the meter of the corresponding block such as ½ or ¾.
Still furthermore, the "key and scale of block" indicates information to be used for modulation. Since an original music piece is sometimes used for mashup processing starting from, or ending at, the middle of a bar, the "original music piece sample start and end points of block" indicate the start and end points of use in the original music piece, expressed in sample units. Furthermore, the "beat count at start point" and the "beat count at end point" indicate, using a bar and a beat, the position in the original music piece at which the corresponding block starts and the position at which it ends, respectively (for example, the third beat in the tenth bar).
The "sample position indicating beginning of bar immediately before start point" indicates the position, expressed in sample units in the original music piece, of the beginning of the bar immediately before the point from which the use of the original music piece for mashup processing starts. The "sample position indicating beginning of bar immediately after end point" indicates the position, expressed in sample units in the original music piece, of the beginning of the bar immediately after the point at which the use of the original music piece for mashup processing ends.
Accordingly, using the above-described pieces of information in the first to ninth lines, a user can determine, in sample units, which parts (which blocks) of the original music piece are required for mashup processing.
Furthermore, referring to FIG. 8B, the "start sample position in mashup" and the "end sample position in mashup" indicate the boundary points between a certain block and the adjacent blocks in the mashup music piece that is the result of mashup processing (see, for example, FIG. 7), that is, the start and end points of that block, expressed in sample units. The "start bar number in mashup" and the "end bar number in mashup" indicate the bar numbers at which the block starts and ends in the mashup music piece, respectively.
Accordingly, using the above-described pieces of information in the tenth to thirteenth lines, a user can know how each block obtained from the original music piece is positioned in the mashup music piece.
Thus, by using the track table TRKTBL illustrated in FIG. 8, a user can extract blocks required for mashup processing from an original music piece on a sample-by-sample basis and combine the extracted blocks so as to create a new mashup music piece. That is, the track table TRKTBL can be used as a recipe.
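The patent describes the fields of the track table but not a concrete file layout. The following Python sketch is one hypothetical rendering of the recipe (the track table TRKTBL of FIG. 8A with block entries as in FIG. 8B); all type and field names are illustrative, and the paired sample start/end points of a block are split into separate fields for readability.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class BlockEntry:
    """One data field (#Ai or #Bj): a block placed on a mashup track (FIG. 8B)."""
    # Information about the original music piece (first to ninth lines of FIG. 8B)
    music_piece_id: str          # identifies the original music piece (A or B)
    original_tempo: float        # tempo of this block in the original piece
    meter: str                   # meter of the block
    key_and_scale: str           # information used for modulation
    src_start_sample: int        # use start point in the original, in samples
    src_end_sample: int          # use end point in the original, in samples
    start_beat: Tuple[int, int]  # (bar, beat) at the start point in the original
    end_beat: Tuple[int, int]    # (bar, beat) at the end point in the original
    bar_head_before_start: int   # sample position of the bar head just before the start point
    bar_head_after_end: int      # sample position of the bar head just after the end point
    # Information about the mashup result (tenth to thirteenth lines of FIG. 8B)
    dst_start_sample: int        # start sample position of the block in the mashup
    dst_end_sample: int          # end sample position of the block in the mashup
    dst_start_bar: int           # start bar number in the mashup
    dst_end_bar: int             # end bar number in the mashup


@dataclass
class TrackArea:
    """One data area (#A, #B, ...): all blocks of one mashup track (11A, 11B, ...)."""
    blocks: List[BlockEntry] = field(default_factory=list)  # #A0/#B0 corresponds to len(blocks)


@dataclass
class TrackTable:
    """The recipe: the track table TRKTBL of FIG. 8A."""
    tracks: List[TrackArea] = field(default_factory=list)   # #NUM corresponds to len(tracks)
```

Serializing such a structure to a small file (for example, JSON) would yield a recipe that references only sample ranges and bar positions, never the audio data itself.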
[5] Example of Hardware
FIG. 9 illustrates an example of a case in which a personal computer performs the above-described mashup processing. A personal computer 100 is configured in the same manner as a general personal computer, and has a CPU 101, a ROM 102, a nonvolatile memory 103, and a RAM 104.
In this case, the CPU 101 executes various programs, and each of the ROM 102 and the memory 103 stores a BIOS to be executed by the CPU 101 and basic data. The RAM 104 functions as a work area when the CPU 101 executes a program. These memories 102 to 104 are connected to the CPU 101 via a system bus 109.
Furthermore, a hard disk drive 105 that is a large-capacity storage is connected to the system bus 109. In this case, as illustrated in FIG. 10, for example, the hard disk drive 105 stores an OS used to operate the personal computer 100, a routine for performing mashup processing, digital audio data of an original music piece (music piece) to be subjected to mashup processing, metadata of the original music piece, and data of a result of mashup processing, that is, the track table TRKTBL (recipe) described previously with reference to FIG. 8. The metadata includes general pieces of data of a corresponding music piece (a music piece ID, a music piece title, an artist name, etc.) and various pieces of data such as tempo data, key data, meter data, and chord data which are required for the creation of a recipe.
User interfaces, such as a keyboard 106 functioning as character input means and a mouse 107 functioning as a pointing device, and a CD drive device 108 functioning as external digital audio data input means, are connected to the system bus 109.
Furthermore, a communication interface circuit 111 is connected to the system bus 109. The personal computer 100 is connected to an external network, for example, the Internet 120, via the communication interface circuit 111. The network 120 is connected to a server 130 storing the pieces of digital audio data and metadata of the above-described original music pieces.
For example, as illustrated in FIG. 10, the pieces of digital audio data and metadata of the original music pieces, which are stored in the server 130, are downloaded via the Internet 120 and the communication circuit 111, and are then stored in the hard disk drive 105. A personal computer or player having a configuration similar to that of the personal computer 100 may be connected to the network 120 as illustrated in FIG. 5.
The personal computer 100 also includes an audio reproduction circuit 112 and a display control circuit 114, and these circuits are also connected to the system bus 109. The audio reproduction circuit 112 receives digital audio data, performs decoding of the digital audio data in accordance with MP3 as appropriate, performs D/A conversion so as to convert the digital audio data into an analog audio signal, and supplies the analog audio signal to a speaker 113.
The display control circuit 114 includes a video RAM (not illustrated) to which display data is supplied. The display data is repeatedly read from the video RAM at a predetermined cycle, and is then converted into a video signal. The video signal is supplied to a display 115. The display 115 displays an image, for example, the image illustrated in FIG. 1, 2, or 3, on the basis of the supplied video signal.
[6] Example of Creation Routine
Referring to FIG. 11, a reference numeral 200 indicates an example of a routine for creating a new music piece by performing mashup processing. As illustrated in FIG. 10, the creation routine 200 is provided in the hard disk drive 105. In FIG. 11, only the part of the routine 200 that is related to the present invention is illustrated.
In the following description, it is assumed that the digital audio data of an original music piece required for mashup processing is copied from a CD (not illustrated) by the CD drive device 108 and is then stored in the hard disk drive 105 in advance, or is downloaded from the server 130 by the communication circuit 111 and is then stored in the hard disk drive 105 in advance.
When mashup processing is performed, the CPU 101 executes the routine 200 as follows. That is, if a user provides an instruction for the execution of the routine 200 using the keyboard 106 or the mouse 107, the CPU 101 starts the routine 200 from step S201. Subsequently, in step S202, after various initial settings have been performed, an original music piece selection mode is set.
In this selection mode, for example, a list of the titles of the music pieces stored in the hard disk drive 105 is displayed. The user then selects, from among the listed music pieces, the original music pieces to be used for mashup processing using the keyboard 106 or the mouse 107; for example, the user selects the music pieces A and B.
Subsequently, in step S203, it is determined whether the pieces of metadata of the music pieces A and B selected in step S202 are stored in the hard disk drive 105. If it is determined that the pieces of metadata of the music pieces A and B are not stored in the hard disk drive 105, the process proceeds from step S203 to step S204. In step S204, the pieces of metadata of the music pieces A and B are downloaded from the server 130 and are then stored in the hard disk drive 105. Subsequently, the process proceeds to step S211.
If it is determined in step S203 that the pieces of metadata of the music pieces A and B selected in step S202 are stored in the hard disk drive 105, the process proceeds from step S203 to step S211.
Thus, the music pieces A and B to be used for mashup processing and the pieces of metadata of the music pieces A and B are prepared. Subsequently, in step S211, the procedures and processing operations described in sections [1] and [2] above, that is, the mashup processing, are executed. Consequently, for example, the track table TRKTBL illustrated in FIG. 8, that is, a recipe, is created.
In this case, the determination of where to locate a block of the original music piece A or B in the track 11A or 11B is performed in accordance with a user's instruction. Furthermore, while the mashup processing is being performed, the user can test-listen to a result of the mashup processing by clicking on the reproduction button 15P. Still furthermore, during the reproduction of a result of the mashup processing, the pointer 11P moves to indicate a reproduction position.
If the user clicks on the save button 17 after the mashup processing has been completed, the recipe created in step S211 is stored in the hard disk drive 105. Subsequently, in step S213, the routine 200 ends.
Thus, according to the routine 200, mashup processing can be performed so as to create a recipe.
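As a rough outline only, the flow of the creation routine 200 could be expressed as follows in Python; the function name, parameters, and callbacks are hypothetical stand-ins for the interactive steps of FIG. 11, and the sketch saves the recipe unconditionally rather than waiting for the save button 17.

```python
def creation_routine_200(selected_ids, metadata_store, download_metadata, edit_mashup, save_recipe):
    """Hypothetical outline of the creation routine 200 (FIG. 11).

    selected_ids      -- IDs of the original music pieces chosen in step S202
    metadata_store    -- dict of already-available metadata, keyed by music piece ID
    download_metadata -- callable fetching metadata from the server 130 (step S204)
    edit_mashup       -- callable running the interactive editing of step S211; returns a recipe
    save_recipe       -- callable storing the recipe on the hard disk drive 105
    """
    # Steps S203/S204: make sure metadata for every selected piece is available locally.
    for piece_id in selected_ids:
        if piece_id not in metadata_store:
            metadata_store[piece_id] = download_metadata(piece_id)

    # Step S211: interactive mashup editing (drag-and-drop of blocks, test-listening).
    recipe = edit_mashup(selected_ids, metadata_store)

    # Save button 17: store the recipe; step S213 then ends the routine.
    save_recipe(recipe)
    return recipe
```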
[7] Another Example of Creation Routine
Referring to FIG. 12, a reference numeral 300 represents another example of a routine for reproducing a new music piece by performing mashup processing. As illustrated in FIG. 10, the reproduction routine 300 is provided in the hard disk drive 105. In FIG. 12, only a part of the routine 300 which is related to the present invention is illustrated.
When mashup processing is performed, the CPU 101 executes the routine 300 as follows. That is, if a user provides an instruction for the execution of the routine 300 using the keyboard 106 or the mouse 107, the CPU 101 starts the routine 300 from step S301. Subsequently, in step S302, the personal computer 100 is connected to the server (site) 130 for mashup processing via the network 120.
Subsequently, in step S303, the pieces of digital audio data of music pieces to be used for mashup processing are selected. In step S304, a recipe for mashup processing (the track table TRKTBL) is selected. In step S305, the pieces of digital audio data of music pieces selected in step S303 and the recipe selected in step S304 are downloaded, and are then stored in the hard disk drive 105.
Thus, the music pieces A and B to be used for mashup processing and the recipe for the mashup processing are prepared. Subsequently, in step S311, the pieces of digital audio data of the music pieces downloaded in step S305 are mashed up in accordance with the recipe downloaded in step S305. By clicking on the reproduction button 15P, the result of the mashup processing is reproduced. Furthermore, if necessary, a user can change the result of the mashup processing using the procedures or methods described in sections [1] and [2].
If the user clicks on the save button 17 after the mashup processing has been completed, the created recipe is stored in the hard disk drive 105 in step S312. Subsequently, in step S313, the routine 300 ends.
Thus, according to the routine 300, a mashup music piece can be obtained. Furthermore, by additionally performing mashup processing, a recipe can be obtained.
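The patent gives no code for the playback in step S311, but conceptually it amounts to copying the sample ranges named in the recipe out of the original audio, placing them at the positions the recipe specifies, and mixing the tracks. A minimal sketch under those assumptions, reusing the hypothetical TrackTable/BlockEntry structures sketched in section [4] and holding audio as NumPy arrays (the tempo and key adjustment covered in section [8] is omitted):

```python
import numpy as np


def render_mashup(track_table, audio_by_piece_id):
    """Build a mashup waveform from a recipe (an illustrative sketch only).

    track_table       -- hypothetical TrackTable: per mashup track, a list of block entries
    audio_by_piece_id -- dict mapping a music piece ID to a 1-D NumPy array of samples
    """
    # The mashup is as long as the largest end position used by any block.
    total_len = max(blk.dst_end_sample
                    for trk in track_table.tracks
                    for blk in trk.blocks)
    mix = np.zeros(total_len, dtype=np.float32)

    for trk in track_table.tracks:
        for blk in trk.blocks:
            src = audio_by_piece_id[blk.music_piece_id]
            segment = src[blk.src_start_sample:blk.src_end_sample]
            # In the real system the block would also be tempo- and key-adjusted here
            # (see the preprocessing in section [8]); this sketch only trims the segment.
            length = blk.dst_end_sample - blk.dst_start_sample
            segment = segment[:length]
            mix[blk.dst_start_sample:blk.dst_start_sample + len(segment)] += segment

    # Simple normalization so that the summed tracks do not clip.
    peak = float(np.max(np.abs(mix)))
    return mix / peak if peak > 0 else mix
```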
[8] Preprocessing
In order to achieve the above-described mashup processing, it is generally required that the tempos, keys, and beat positions of the original music pieces A and B be the same. However, the tempos, keys, and beat positions of the music pieces A and B are often different from each other. Furthermore, in some cases, it is more effective to change the tempo and key of a mashup music piece that is a result of mashup processing.
Accordingly, the tempo and key of an original music piece are changed, and this change can be executed using a technique disclosed in Japanese Patent Application No. 2004-269085. That is, the tempo and meter of an original music piece can be detected from the sound level and peak sound of the original music piece. Tempo control is then performed so that the beat start position (that is, the start position of a bar) of one original music piece, or of the desired mashup music piece that is the result of mashup processing, coincides with the beat start position of the other original music piece. Furthermore, key control (modulation processing) is performed.
As illustrated in FIGS. 13 and 14A, the samples of the digital audio data DORG of an original music piece are sequentially written into a memory MM. Subsequently, as illustrated in FIG. 14B, the written samples of the digital audio data DORG are decimated, for example, at a rate of one sample per two samples, and the decimated samples are read out repeatedly, two times. Consequently, digital audio data DCHG consisting of the read samples is obtained. In the digital audio data DCHG, the frequency of the audio becomes twice that of the original digital audio data DORG; that is, the sound pitch of the music piece in the digital audio data DCHG is one octave higher than in the digital audio data DORG. Accordingly, by setting the decimation rate used when the digital audio data DCHG is read from the memory MM or the number of reading repetitions, a desired tempo and a desired key can be obtained.
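The decimate-and-repeat read-out described above can be illustrated with a few lines of array code. The sketch below is a deliberate simplification (plain integer decimation with no interpolation or anti-alias filtering, applied to the whole buffer at once); the function name and defaults are assumptions used only to show how the decimation rate and the number of read repetitions change pitch and duration.

```python
import numpy as np


def decimate_and_repeat(d_org, decimation_rate=2, repetitions=2):
    """Toy version of the memory read-out of FIGS. 13 and 14.

    d_org           -- 1-D NumPy array of samples (digital audio data DORG)
    decimation_rate -- keep one sample out of every `decimation_rate` samples
    repetitions     -- how many times the decimated samples are read out

    Keeping one sample in two doubles the frequency of the audio at an
    unchanged sampling rate (one octave up); reading the decimated data
    twice restores the original overall duration.
    """
    decimated = d_org[::decimation_rate]      # samples read out of the memory MM
    d_chg = np.tile(decimated, repetitions)   # repeated read-out gives data DCHG
    return d_chg


# Example: a 440 Hz tone becomes an 880 Hz tone of the same total length.
sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 440.0 * t).astype(np.float32)
shifted = decimate_and_repeat(tone, decimation_rate=2, repetitions=2)
assert len(shifted) == len(tone)
```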
[9] Conclusion
According to the above-described system, it is possible to perform mashup processing without special knowledge of mashup processing. By reprocessing a recipe or a result of mashup processing, it is further possible to modify or develop the result of mashup processing.
Furthermore, a recipe can be distributed among a plurality of players (users) via a storage medium, a P2P network, a server, a homepage, or the like. Users who have received the recipe can listen to a mashup music piece. In that case, users who will listen to the mashup music piece prepare the original music pieces A and B. The prepared original music pieces A and B are simply reproduced in accordance with the recipe. Accordingly, the possibility that the infringement of copyrights of the original music pieces A and B will occur is low. Conversely, this system can contribute to sales of the original music pieces A and B.
[10] Others
In the above description, it is assumed that the music piece A is a vocal solo and the music piece B is a musical scale played by various instruments using various chords. In this case, a musical accompaniment may be added to the music piece A. Alternatively, one of the music pieces A and B may be used as various types of effect sound. If the music piece B is the same as the music piece A, that is, if only the music piece A is selected, mashup processing can be performed using only blocks of the music piece A.
Furthermore, in the above description, a plurality of music pieces such as the music pieces A and B are mashed up. However, the present invention can also be applied to the following cases: a case in which a plurality of pieces of content, for example, a plurality of moving images, are mashed up; a case in which musical sound, speech, and effect sound are added to a moving image; and a case in which a plurality of written works are mashed up. Furthermore, when mashup processing is performed, reverberation processing, echo processing, equalizer processing, etc. can be performed at the same time.
[List of Abbreviations]
BIOS: Basic Input/Output System
CD: Compact Disc
CPU: Central Processing Unit
D/A: Digital to Analog
MP3: MPEG-1/Audio Layer 3
MPEG: Moving Picture Experts Group
OS: Operating System
P2P: Peer to Peer
RAM: Random Access Memory
ROM: Read-Only Memory

Claims (5)

The invention claimed is:
1. A mashup data file used for mashing up at least a first piece of content and a second piece of content, comprising:
first data used for dividing each of the first piece of content and the second piece of content into a plurality of blocks in accordance with each piece of content; and
second data indicating a sequence for arranging the plurality of blocks to create a new piece of content,
in which the first data includes for each block information indicative of a tempo, information usable for modulation, and information indicative of a bar and beat at a start point and at an end point.
2. The mashup data file according to claim 1, wherein each of the first piece of content and the second piece of content is a sound piece of content, an image piece of content, or a character piece of content.
3. The mashup data file according to claim 1, wherein the first piece of content and the second piece of content are the same.
4. A content creation method comprising:
mashing up a first piece of content and a second piece of content using first data used for dividing each of the first piece of content and the second piece of content into a plurality of blocks in accordance with each piece of content and second data indicating a sequence for arranging the plurality of blocks to create a new piece of content,
in which the first data includes for each block information indicative of a tempo, information usable for modulation, and information indicative of a bar and beat at a start point and at an end point.
5. A mashup apparatus for mashing up at least a first piece of content and a second piece of content comprising:
recording means for recording first data used for dividing each of the first piece of content and the second piece of content into a plurality of blocks in accordance with each piece of content; and
recording means for recording second data indicating a sequence for arranging the plurality of blocks to create a new piece of content, and
wherein the first piece of content and the second piece of content are mashed up using the first data and the second data, and a result of mashup processing is output, and
in which the first data includes for each block information indicative of a tempo, information usable for modulation, and information indicative of a bar and beat at a start point and at an end point.
US12/312,947 2006-11-28 2007-10-09 Mashup data file, mashup apparatus, and content creation method Expired - Fee Related US8115090B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2006319641A JP5259075B2 (en) 2006-11-28 2006-11-28 Mashup device and content creation method
JP2006-319641 2006-11-28
PCT/JP2007/069680 WO2008065808A1 (en) 2006-11-28 2007-10-09 Mashing-up data file, mashing-up device and contents making-out method

Publications (2)

Publication Number Publication Date
US20100064882A1 US20100064882A1 (en) 2010-03-18
US8115090B2 true US8115090B2 (en) 2012-02-14

Family

ID=39467605

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/312,947 Expired - Fee Related US8115090B2 (en) 2006-11-28 2007-10-09 Mashup data file, mashup apparatus, and content creation method

Country Status (6)

Country Link
US (1) US8115090B2 (en)
EP (1) EP2099023B1 (en)
JP (1) JP5259075B2 (en)
CN (1) CN101542588B (en)
TW (1) TW200828263A (en)
WO (1) WO2008065808A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100186579A1 (en) * 2008-10-24 2010-07-29 Myles Schnitman Media system with playing component
WO2014028891A1 (en) * 2012-08-17 2014-02-20 Be Labs, Llc Music generator
US20140076125A1 (en) * 2012-09-19 2014-03-20 Ujam Inc. Adjustment of song length
US20150128788A1 (en) * 2013-11-14 2015-05-14 tuneSplice LLC Method, device and system for automatically adjusting a duration of a song
US9111519B1 (en) * 2011-10-26 2015-08-18 Mixwolf LLC System and method for generating cuepoints for mixing song data
US20190236209A1 (en) * 2018-01-29 2019-08-01 Gary Bencar Artificial intelligence methodology to automatically generate interactive play along songs
US10679596B2 (en) 2018-05-24 2020-06-09 Aimi Inc. Music generator
US11635936B2 (en) 2020-02-11 2023-04-25 Aimi Inc. Audio techniques for music content generation

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5259083B2 (en) 2006-12-04 2013-08-07 ソニー株式会社 Mashup data distribution method, mashup method, mashup data server device, and mashup device
JP4933932B2 (en) 2007-03-23 2012-05-16 ソニー株式会社 Information processing system, information processing apparatus, information processing method, and program
JP4367662B2 (en) * 2007-03-23 2009-11-18 ソニー株式会社 Information processing system, terminal device, information processing method, program
US8173883B2 (en) * 2007-10-24 2012-05-08 Funk Machine Inc. Personalized music remixing
US8458600B2 (en) * 2009-12-31 2013-06-04 International Business Machines Corporation Distributed multi-user mashup session
US8458221B2 (en) 2010-10-13 2013-06-04 Sony Corporation Method and system and file format of generating content by reference
US8426716B2 (en) * 2011-01-07 2013-04-23 Apple Inc. Intelligent keyboard interface for virtual musical instrument
WO2012170904A2 (en) * 2011-06-10 2012-12-13 Bytemobile, Inc. Adaptive bitrate management on progressive download with indexed media files
US10971191B2 (en) * 2012-12-12 2021-04-06 Smule, Inc. Coordinated audiovisual montage from selected crowd-sourced content with alignment to audio baseline
JP5846288B2 (en) * 2014-12-26 2016-01-20 ヤマハ株式会社 Phrase data search device and program
GB2538994B (en) 2015-06-02 2021-09-15 Sublime Binary Ltd Music generation tool
GB2557970B (en) * 2016-12-20 2020-12-09 Mashtraxx Ltd Content tracking system and method
US11024276B1 (en) * 2017-09-27 2021-06-01 Diana Dabby Method of creating musical compositions and other symbolic sequences by artificial intelligence
CN108959500A (en) * 2018-06-26 2018-12-07 郑州云海信息技术有限公司 A kind of object storage method, device, equipment and computer readable storage medium
JP6683322B2 (en) * 2018-10-11 2020-04-15 株式会社コナミアミューズメント Game system, game program, and method of creating synthetic music
JP7439755B2 (en) * 2018-10-19 2024-02-28 ソニーグループ株式会社 Information processing device, information processing method, and information processing program

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH076512A (en) 1993-06-17 1995-01-10 Pioneer Electron Corp Information reproducing device
JP2001257967A (en) 2000-03-09 2001-09-21 Sharp Corp Information editing service system
JP2003108132A (en) 2001-09-28 2003-04-11 Pioneer Electronic Corp Device and system for audio information reproduction
JP2004269085A (en) 2003-03-05 2004-09-30 Shikoku Kakoki Co Ltd Container carrying conveyer device
JP2005020202A (en) 2003-06-24 2005-01-20 Canon Inc Reproduction apparatus, reproduction method, and recording medium and program for them
JP2006030538A (en) 2004-07-15 2006-02-02 Yamaha Corp Musical piece data editing/reproducing device and mobile information terminal using same
JP2006047644A (en) 2004-08-04 2006-02-16 Denso Corp Exchange system for lists of musical piece, video content, electronic book, and web content, and server and terminal device used therefor
US20060054005A1 (en) 2004-09-16 2006-03-16 Sony Corporation Playback apparatus and playback method
JP2006107693A (en) 2004-09-10 2006-04-20 Sony Corp Recording medium, recording device, recording method, data output device, data output method, and data distribution/circulation system
WO2006114998A1 (en) 2005-04-25 2006-11-02 Sony Corporation Musical content reproducing device and musical content reproducing method
US20060263037A1 (en) * 2005-05-23 2006-11-23 Gilley Thomas S Distributed scalable media environment
JP2006337914A (en) 2005-06-06 2006-12-14 Kddi Corp Music player capable of musical piece remixing, musical piece remixing method, and program
US20070300258A1 (en) * 2001-01-29 2007-12-27 O'connor Daniel Methods and systems for providing media assets over a network
US20080016114A1 (en) * 2006-07-14 2008-01-17 Gerald Thomas Beauregard Creating a new music video by intercutting user-supplied visual data with a pre-existing music video
US20080127812A1 (en) * 2006-12-04 2008-06-05 Sony Corporation Method of distributing mashup data, mashup method, server apparatus for mashup data, and mashup apparatus
US20090019995A1 (en) * 2006-12-28 2009-01-22 Yasushi Miyajima Music Editing Apparatus and Method and Program
US20090158238A1 (en) * 2007-12-14 2009-06-18 Samsung Electronics Co., Ltd. Method and apparatus for providing api service and making api mash-up, and computer readable recording medium thereof
US20090204594A1 (en) * 2008-02-07 2009-08-13 Rama Kalyani Akkiraju Recommendation System for Assisting Mashup Developers at Build-Time
US20090287987A1 (en) * 2008-05-19 2009-11-19 Microsoft Corporation Non-destructive media presentation derivatives
US20100125826A1 (en) * 2008-11-18 2010-05-20 Microsoft Corporation Workflow engine for execution of web mashups
US7777121B2 (en) * 2007-08-21 2010-08-17 Sony Corporation Information processing apparatus, information processing method, and computer program
US20100209003A1 (en) * 2009-02-16 2010-08-19 Cisco Technology, Inc. Method and apparatus for automatic mash-up generation

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5693902A (en) * 1995-09-22 1997-12-02 Sonic Desktop Software Audio block sequence compiler for generating prescribed duration audio sequences
JP3053090B1 (en) * 1999-02-26 2000-06-19 コナミ株式会社 Music game system, game control method suitable for the game system, and computer-readable storage medium
JP2002196754A (en) * 2000-10-18 2002-07-12 Victor Co Of Japan Ltd Data compression method, data transmission method and data reproducing method
US6888999B2 (en) * 2001-03-16 2005-05-03 Magix Ag Method of remixing digital information
JP4951833B2 (en) * 2001-09-10 2012-06-13 ソニー株式会社 Display device and method
JP3609045B2 (en) * 2001-10-05 2005-01-12 株式会社河合楽器製作所 Automatic performance device
US7208672B2 (en) * 2003-02-19 2007-04-24 Noam Camiel System and method for structuring and mixing audio tracks
US7642444B2 (en) * 2006-11-17 2010-01-05 Yamaha Corporation Music-piece processing apparatus and method

Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH076512A (en) 1993-06-17 1995-01-10 Pioneer Electron Corp Information reproducing device
JP2001257967A (en) 2000-03-09 2001-09-21 Sharp Corp Information editing service system
US20080059989A1 (en) * 2001-01-29 2008-03-06 O'connor Dan Methods and systems for providing media assets over a network
US20070300258A1 (en) * 2001-01-29 2007-12-27 O'connor Daniel Methods and systems for providing media assets over a network
JP2003108132A (en) 2001-09-28 2003-04-11 Pioneer Electronic Corp Device and system for audio information reproduction
US7080016B2 (en) 2001-09-28 2006-07-18 Pioneer Corporation Audio information reproduction device and audio information reproduction system
JP2004269085A (en) 2003-03-05 2004-09-30 Shikoku Kakoki Co Ltd Container carrying conveyer device
JP2005020202A (en) 2003-06-24 2005-01-20 Canon Inc Reproduction apparatus, reproduction method, and recording medium and program for them
JP2006030538A (en) 2004-07-15 2006-02-02 Yamaha Corp Musical piece data editing/reproducing device and mobile information terminal using same
JP2006047644A (en) 2004-08-04 2006-02-16 Denso Corp Exchange system for lists of musical piece, video content, electronic book, and web content, and server and terminal device used therefor
US20080259745A1 (en) 2004-09-10 2008-10-23 Sony Corporation Document Recording Medium, Recording Apparatus, Recording Method, Data Output Apparatus, Data Output Method and Data Delivery/Distribution System
JP2006107693A (en) 2004-09-10 2006-04-20 Sony Corp Recording medium, recording device, recording method, data output device, data output method, and data distribution/circulation system
JP2006084748A (en) 2004-09-16 2006-03-30 Sony Corp Device and method for reproduction
US20060054005A1 (en) 2004-09-16 2006-03-16 Sony Corporation Playback apparatus and playback method
WO2006114998A1 (en) 2005-04-25 2006-11-02 Sony Corporation Musical content reproducing device and musical content reproducing method
US20060265657A1 (en) * 2005-05-23 2006-11-23 Gilley Thomas S Distributed scalable media environment
US20060263037A1 (en) * 2005-05-23 2006-11-23 Gilley Thomas S Distributed scalable media environment
JP2006337914A (en) 2005-06-06 2006-12-14 Kddi Corp Music player capable of musical piece remixing, musical piece remixing method, and program
US20080016114A1 (en) * 2006-07-14 2008-01-17 Gerald Thomas Beauregard Creating a new music video by intercutting user-supplied visual data with a pre-existing music video
US7716572B2 (en) * 2006-07-14 2010-05-11 Muvee Technologies Pte Ltd. Creating a new music video by intercutting user-supplied visual data with a pre-existing music video
US20080127812A1 (en) * 2006-12-04 2008-06-05 Sony Corporation Method of distributing mashup data, mashup method, server apparatus for mashup data, and mashup apparatus
US20090019995A1 (en) * 2006-12-28 2009-01-22 Yasushi Miyajima Music Editing Apparatus and Method and Program
US7626112B2 (en) * 2006-12-28 2009-12-01 Sony Corporation Music editing apparatus and method and program
US7777121B2 (en) * 2007-08-21 2010-08-17 Sony Corporation Information processing apparatus, information processing method, and computer program
US20090158238A1 (en) * 2007-12-14 2009-06-18 Samsung Electronics Co., Ltd. Method and apparatus for providing api service and making api mash-up, and computer readable recording medium thereof
US20090204594A1 (en) * 2008-02-07 2009-08-13 Rama Kalyani Akkiraju Recommendation System for Assisting Mashup Developers at Build-Time
US20090287987A1 (en) * 2008-05-19 2009-11-19 Microsoft Corporation Non-destructive media presentation derivatives
US20100125826A1 (en) * 2008-11-18 2010-05-20 Microsoft Corporation Workflow engine for execution of web mashups
US20100209003A1 (en) * 2009-02-16 2010-08-19 Cisco Technology, Inc. Method and apparatus for automatic mash-up generation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Office Action from Japanese Application No. 2006-319641, dated Jun. 28, 2011.

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8841536B2 (en) * 2008-10-24 2014-09-23 Magnaforte, Llc Media system with playing component
US20100186579A1 (en) * 2008-10-24 2010-07-29 Myles Schnitman Media system with playing component
US9111519B1 (en) * 2011-10-26 2015-08-18 Mixwolf LLC System and method for generating cuepoints for mixing song data
US20210089267A1 (en) * 2012-08-17 2021-03-25 Aimi Inc. Music generator
US10817250B2 (en) 2012-08-17 2020-10-27 Aimi Inc. Music generator
US20150378669A1 (en) * 2012-08-17 2015-12-31 Be Labs, Llc Music generator
US10095467B2 (en) * 2012-08-17 2018-10-09 Be Labs, Llc Music generator
US11625217B2 (en) * 2012-08-17 2023-04-11 Aimi Inc. Music generator
US8812144B2 (en) 2012-08-17 2014-08-19 Be Labs, Llc Music generator
WO2014028891A1 (en) * 2012-08-17 2014-02-20 Be Labs, Llc Music generator
US20140076124A1 (en) * 2012-09-19 2014-03-20 Ujam Inc. Song length adjustment
US9070351B2 (en) * 2012-09-19 2015-06-30 Ujam Inc. Adjustment of song length
US20140076125A1 (en) * 2012-09-19 2014-03-20 Ujam Inc. Adjustment of song length
US9230528B2 (en) * 2012-09-19 2016-01-05 Ujam Inc. Song length adjustment
US20150128788A1 (en) * 2013-11-14 2015-05-14 tuneSplice LLC Method, device and system for automatically adjusting a duration of a song
US9613605B2 (en) * 2013-11-14 2017-04-04 Tunesplice, Llc Method, device and system for automatically adjusting a duration of a song
US10534811B2 (en) * 2018-01-29 2020-01-14 Beamz Ip, Llc Artificial intelligence methodology to automatically generate interactive play along songs
US20200142926A1 (en) * 2018-01-29 2020-05-07 Beamz Ip, Llc Artificial intelligence methodology to automatically generate interactive play along songs
US20190236209A1 (en) * 2018-01-29 2019-08-01 Gary Bencar Artificial intelligence methodology to automatically generate interactive play along songs
US10679596B2 (en) 2018-05-24 2020-06-09 Aimi Inc. Music generator
US11450301B2 (en) * 2018-05-24 2022-09-20 Aimi Inc. Music generator
US11635936B2 (en) 2020-02-11 2023-04-25 Aimi Inc. Audio techniques for music content generation
US11914919B2 (en) 2020-02-11 2024-02-27 Aimi Inc. Listener-defined controls for music content generation
US11947864B2 (en) 2020-02-11 2024-04-02 Aimi Inc. Music content generation using image representations of audio files

Also Published As

Publication number Publication date
WO2008065808A1 (en) 2008-06-05
EP2099023A4 (en) 2015-11-04
JP2008134375A (en) 2008-06-12
CN101542588B (en) 2013-10-23
EP2099023A1 (en) 2009-09-09
TWI348681B (en) 2011-09-11
JP5259075B2 (en) 2013-08-07
US20100064882A1 (en) 2010-03-18
EP2099023B1 (en) 2017-08-23
CN101542588A (en) 2009-09-23
TW200828263A (en) 2008-07-01

Similar Documents

Publication Publication Date Title
US8115090B2 (en) Mashup data file, mashup apparatus, and content creation method
US7956276B2 (en) Method of distributing mashup data, mashup method, server apparatus for mashup data, and mashup apparatus
AU733315B2 (en) Method and apparatus for interactively creating new arrangements for musical compositions
US8173883B2 (en) Personalized music remixing
Goto Active music listening interfaces based on signal processing
US8426715B2 (en) Client-side audio signal mixing on low computational power player using beat metadata
US20120014673A1 (en) Video and audio content system
JP2008287125A (en) Method of displaying content, device of displaying content, recording medium and server device
JP2009529717A (en) Method and apparatus for automatically creating music
Cliff Hang the DJ: Automatic sequencing and seamless mixing of dance-music tracks
CN103718243A (en) Enhanced media recordings and playback
JP2012088402A (en) Information processor, information processing method, and program
US20090292731A1 (en) Method And Apparatus For Generating A Composite Media File
US7612279B1 (en) Methods and apparatus for structuring audio data
JP2001296864A (en) Performance information editing and reproducing device
Cliff hpDJ: An automated DJ with floorshow feedback
EP3926619A1 (en) Information processing device, information processing method, and information processing program
JP3843688B2 (en) Music data editing device
JPH10124075A (en) Text wipe information input device and recording medium
JPH11344975A (en) Musical performance information preparing and displaying device and record medium therefor
Nahmani Logic Pro-Apple Pro Training Series: Professional Music Production
JPH10503851A (en) Rearrangement of works of art
JP2003337586A (en) Performance information preparation display device and recording medium therefor
Dambly Pro Tools 8 for MAC OS X and Windows: Visual Quickstart Guide
JP2005300739A (en) Device for editing musical performance data

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYAJIMA, YASUSHI;SAKO, YOICHIRO;SIGNING DATES FROM 20090408 TO 20090422;REEL/FRAME:022790/0136

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240214