US20070254737A1 - Contents data processing apparatus and method


Info

Publication number
US20070254737A1
Authority
US
United States
Prior art keywords
contents data
information
character
character information
unit
Prior art date
Legal status
Abandoned
Application number
US11/827,629
Inventor
Yoichiro Sako
Mitsuru Toriyama
Tatsuya Inokuchi
Yoshimasa Utsumi
Kaoru Kijima
Kazuko Sakurai
Takashi Kihara
Shunsuke Furukawa
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Priority claimed from JP2002043660A
Priority claimed from JP2002053502A
Application filed by Sony Corp filed Critical Sony Corp
Priority to US11/827,629
Publication of US20070254737A1
Assigned to SONY CORPORATION. Assignors: UTSUMI, YOSHIMASA, SAKURAI, KAZUKO, KIJIMA, KAORU, FURUKAWA, SHUNSUKE, KIHARA, TAKASHI, INOKUCHI, TATSUYA, SAKO, YOICHIRO, TORIYAMA, MITSURU


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/10
    • A63F13/45: Controlling the progress of the video game
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69: Generating or modifying game content before or while executing the game program by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A63F13/70: Game security or game management aspects
    • A63F13/71: Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players
    • A63F13/79: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/792: Game security or game management aspects involving player-related data, for payment purposes, e.g. monthly subscriptions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/825: Fostering virtual characters
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games using an electronically generated display specially adapted for executing a specific type of game
    • A63F2300/8058: Virtual breeding, e.g. tamagotchi

Definitions

  • the present invention relates to a contents data processing method and a contents data processing apparatus. More particularly, the present invention relates to a contents data processing method and a contents data processing apparatus in which contents data is processed using character information.
  • character bringing-up games, in which users bring up virtual characters and enjoy the characters' growing process, have been developed in various forms.
  • portable game machines in which users bring up characters displayed on a liquid crystal panel by repeating breeding operations such as feeding and exercising the characters
  • application software for personal computers which allows users to dialogue with characters to be brought up.
  • Japanese Laid Open Patent No. 11-231880 discloses an information distributing system in which a character image displayed on an information terminal device is brought up depending on download history of music, e.g., karaoke songs, from an information distributing apparatus to the information terminal device.
  • the above-described portable game machines and application software lay main emphasis on bringing-up of characters so that users are interested in the process of bringing up characters.
  • the above-described information distributing system increases entertainment value by adding a factor of character growth to the playing of music.
  • a contents data processing method comprises the steps of, at the time of processing contents data, reproducing character information changed with processing of the contents data, and selectively changing processing of the contents data in accordance with the reproduced character information.
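As an illustration of the claimed method, the following Python sketch (all names, thresholds, and the dict layout are assumptions made for illustration, not taken from the patent) reproduces the character information first and then selectively changes how the contents data is processed:

```python
# Hypothetical sketch of the claimed method: reproduce the character
# information that changes with processing of contents data, then
# selectively change the processing according to it.

def reproduce_character(character: dict) -> None:
    # Stand-in for displaying the character image / playing its voice.
    print(f"Character '{character['kind']}' (growth {character['growth']})")

def process_contents(contents: bytes, character: dict) -> bytes | None:
    reproduce_character(character)            # reproduce character information
    if character["growth"] < 10:              # processing prohibited while immature
        return None
    if character["growth"] < 50:              # limited processing: reduced quality
        return contents[: len(contents) // 2]
    return contents                           # full-quality processing

result = process_contents(b"music-data" * 4, {"kind": "dog", "growth": 30})
```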
  • a contents data processing apparatus including a storing unit, a reproducing unit, and a processing unit.
  • the storing unit stores character information.
  • the reproducing unit reproduces character information read out of the storing unit.
  • the processing unit processes supplied contents data.
  • the processing unit selectively changes processing of the contents data in accordance with the character information reproduced by the reproducing unit.
  • a contents data processing apparatus including a reproducing unit and a processing unit.
  • the reproducing unit reproduces character information associated with supplied contents data.
  • the processing unit processes the supplied contents data.
  • the processing unit selectively changes processing of the contents data in accordance with the character information reproduced by the reproducing unit.
  • a contents data processing apparatus including a creating unit and a processing unit.
  • the creating unit creates character information from information associated with supplied contents data.
  • the processing unit processes the supplied contents data.
  • the processing unit selectively changes processing of the contents data in accordance with the character information created by the creating unit.
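A minimal sketch of this third variant, under the assumption that the information associated with the contents data is simple metadata (the mapping and field names below are invented for illustration):

```python
# Hypothetical creating unit: derive character information from
# information associated with the contents data (here, metadata),
# which the processing unit can then use to change its behaviour.

def create_character(metadata: dict) -> dict:
    genre_to_kind = {"rock": "lion", "classical": "owl"}   # invented mapping
    return {
        "kind": genre_to_kind.get(metadata.get("genre", ""), "cat"),
        "growth": min(100, metadata.get("play_count", 0)),
    }

character = create_character({"genre": "rock", "play_count": 12})
```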
  • FIG. 1 is a schematic block diagram showing a first configuration example of a contents data processing apparatus according to a first embodiment of the present invention
  • FIG. 2 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 1 ;
  • FIG. 3 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the first embodiment of the present invention
  • FIG. 4 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 3 ;
  • FIG. 5 is a flowchart for explaining one example of a billing process
  • FIG. 6 is a schematic block diagram showing a first configuration example of a contents data processing apparatus according to a second embodiment of the present invention.
  • FIG. 7 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 6 ;
  • FIG. 8 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the second embodiment of the present invention.
  • FIG. 9 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 8 ;
  • FIG. 10 is a schematic block diagram showing a third configuration example of a contents data processing apparatus according to the second embodiment of the present invention.
  • FIG. 11 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 10 ;
  • FIG. 12 is a schematic block diagram showing a fourth configuration example of a contents data processing apparatus according to the second embodiment of the present invention.
  • FIG. 13 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 12 ;
  • FIG. 14 is a schematic block diagram showing a fifth configuration example of a contents data processing apparatus according to the second embodiment of the present invention.
  • FIG. 15 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 14 ;
  • FIG. 16 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a third embodiment of the present invention.
  • FIG. 17 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 16 ;
  • FIG. 18 is a flowchart for explaining one example of a step of creating character information in the flowchart of FIG. 17 ;
  • FIG. 19 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a fourth embodiment of the present invention.
  • FIG. 20 is a flowchart for explaining one example of a process of reproducing contents data executed in the contents data processing apparatus of FIG. 19 ;
  • FIG. 21 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a fifth embodiment of the present invention.
  • in the first embodiment, the contents data processing apparatus has two configuration examples, which are explained in sequence in the following description.
  • FIG. 1 is a schematic block diagram showing a first configuration example of the contents data processing apparatus according to the first embodiment of the present invention.
  • a contents data processing apparatus 100 shown in FIG. 1 comprises a character information storing unit 1 , a character information reproducing unit 2 , a processing unit 3 , a character information updating unit 4 , a contents data input unit 5 , and a user interface (I/F) unit 6 .
  • the character information storing unit 1 stores information regarding a character brought up with processing of contents data (hereinafter referred to simply as “character information”).
  • the character information contains, for example, information representing a temper, a growth process, and other nature of each character, specifically degrees of temper, growth, etc., in numerical values, and information indicating the types of characters, specifically the kinds of persons, animals, etc.
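One plausible in-memory representation of such character information, with field names and value ranges assumed for illustration only:

```python
from dataclasses import dataclass

@dataclass
class CharacterInfo:
    kind: str        # type of character, e.g. "person" or "dog"
    temper: int      # degree of temper, expressed as a numerical value
    growth: int      # degree of growth, expressed as a numerical value
    image_id: str    # key into the stored image/voice data used for reproduction

info = CharacterInfo(kind="dog", temper=40, growth=0, image_id="puppy")
```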
  • information reproduced in the character information reproducing unit 2 described later, e.g., information of images and voices, is also stored in the character information storing unit 1 .
  • the character information reproducing unit 2 reproduces information depending on the character information stored in the character information storing unit 1 .
  • the reproducing unit 2 reproduces, from among plural pieces of image and voice information stored in the character information storing unit 1 beforehand, information selected depending on the information indicating the nature and type of the character.
  • the reproducing unit 2 may additionally process the image and voice of the character. In other words, the reproducing unit 2 may additionally execute various processes such as changing the shape, hue, brightness and action of the character image, the loudness and tone of the character voice, and the number of characters.
  • the character information reproducing unit 2 may execute a process of combining a reproduction result of the processing unit 3 and a reproduction result of the character information with each other.
  • the processing unit 3 executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5 . On this occasion, the quality of the processing executed by the processing unit 3 is changed depending on the character information stored in the character information storing unit 1 .
  • since the processing unit 3 comprises a contents data recording section 31 and a contents data reproducing section 32 , as shown in FIG. 1 , the quality in recording and reproducing the contents data is changed depending on the character information.
  • the contents data recording section 31 includes a storage device using a storage medium, such as a hard disk or a semiconductor memory, and records contents data inputted from the contents data input unit 5 in the storage device with the quality depending on the character information.
  • the contents data recording section 31 may not include a storage device therein.
  • the inputted contents data may be recorded in a storage device accessible via a wireless or wired communication line.
  • the quality in recording of the contents data to be changed in the processing unit 3 depending on the character information contains, for example, image quality of image data included in the contents data, sound quality of voice data therein, the number of voice channels (stereo/monaural, etc.), a data compression method and rate, etc.
  • the effective number of times and the effective period, at and during which the contents data recorded in the processing unit 3 is reproducible may be set and changed depending on the character information. In this case, the contents data having exceeded the set effective number of times or the set effective period is erased or made unreadable.
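A sketch of how such an effective number of times and effective period might be enforced (limit values and dict layout are assumptions; the patent leaves erasing versus making unreadable as alternatives):

```python
import time

def still_effective(record: dict) -> bool:
    # Limits are assumed to have been set from the character information.
    expired = (record["play_count"] >= record["max_plays"]
               or time.time() > record["expires_at"])
    if expired:
        record["data"] = None   # erase; marking unreadable is the alternative
    return not expired

record = {"data": b"song", "play_count": 0, "max_plays": 3,
          "expires_at": time.time() + 7 * 24 * 3600}
```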
  • the processing unit 3 may just permit or prohibit the recording of the contents data depending on the character information.
  • the contents data reproducing section 32 includes, for example, an image reproducing device, such as a display, for reproducing image information, and a voice reproducing device, such as a speaker, for reproducing voice information.
  • the contents data reproducing section 32 reproduces image information and voice information, which are contained in the contents data inputted from the contents data input unit 5 or the contents data read out of the contents data recording section 31 , with the quality depending on the character information.
  • the contents data reproducing section 32 may include neither an image reproducing device nor a voice reproducing device.
  • reproduced image and voice information may be outputted to the user interface unit 6 or any other suitable device so as to reproduce the contents data in the device at the output destination.
  • the quality in reproducing the contents data to be changed in the processing unit 3 depending on the character information contains, for example, image quality, sound quality, the number of voice channels, a data compression method and rate, etc.
  • the effective number of times and the effective period, at and during which the contents data recorded in the contents data recording section 31 of the processing unit 3 is reproducible may be set and changed depending on the character information. In this case, the contents data having exceeded the set effective number of times or the set effective period is erased or made unreadable.
  • the processing unit 3 may just permit or prohibit the reproduction of the contents data depending on the character information.
  • the processing unit 3 may change the processing executed on the supplemental contents data depending on the character information.
  • the processing unit 3 permits or prohibits the process of recording the supplemental contents data in the contents data recording section 31 and the process of reproducing the supplemental contents data in the contents data reproducing section 32 , or changes the quality in recording and reproduction of the supplemental contents data depending on the character information.
  • Examples of the supplemental contents data include lyrics (words) information, jacket photographs, profile information of artists, and liner notes, which are added to music contents data.
  • the supplemental contents data may be a coupon including various bonuses.
  • the character information updating unit 4 updates the character information stored in the character information storing unit 1 depending on the run status of the processing executed in the processing unit 3 .
  • the character information updating unit 4 updates the information regarding the nature and type of the character by increasing the degree of growth of the character whenever the contents data is recorded or reproduced, and by reducing the degree of growth of the character when the contents data is neither recorded nor reproduced.
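A minimal sketch of that update rule (the step sizes and bounds are arbitrary assumptions):

```python
def update_character(character: dict, processed: bool) -> None:
    # Growth rises whenever contents data is recorded or reproduced,
    # and falls when the apparatus goes unused.
    if processed:
        character["growth"] = min(100, character["growth"] + 1)
    else:
        character["growth"] = max(0, character["growth"] - 1)
```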
  • the contents data input unit 5 is a block for inputting contents data to the processing unit 3 , and may comprise any suitable one of various devices, such as an information reader for reading contents data recorded on a storage medium, e.g., a memory card and a magneto-optic disk, and a communication device for accessing a device in which contents data is held, and then downloading the contents data.
  • the user interface unit 6 transmits, to the processing unit 3 , an instruction given from the user through a predetermined operation performed by the user using a switch, a button, a mouse, a keyboard, a microphone, etc.
  • a display, a lamp, a speaker, etc. may also be used to output the processing result of the processing unit 3 to the user.
  • the character information stored in the character information storing unit 1 is reproduced in the character information reproducing unit 2 for displaying the progress in bringing up the character to the user (step ST 101 ).
  • the processing unit 3 prompts the user to select a process through the user interface unit 6 , and the user selects one of first to third processes (step ST 102 ).
  • the selection result is inputted from the user interface unit 6 to the processing unit 3 .
  • if the first process is selected, a process of recording contents data inputted from the contents data input unit 5 in the contents data recording section 31 is executed.
  • in step ST 103 , it is determined whether recording of the contents data is permitted. If the recording is permitted, the contents data is inputted from the contents data input unit 5 to the processing unit 3 (step ST 104 ), and then recorded in the contents data recording section 31 (step ST 105 ). At this time, the quality in recording the contents data is set depending on the character information. For example, recording of supplemental contents data, such as lyrics (words) information, is permitted or prohibited in accordance with the character information, and the quality in recording the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 103 based on the current character information that the recording of the contents data is not permitted, the process of inputting the contents data (step ST 104 ) and the process of recording the contents data (step ST 105 ) are skipped.
  • if the second process is selected, it is determined in step ST 106 whether reproduction of the contents data inputted from the contents data input unit 5 is permitted. If the reproduction is permitted, the contents data is inputted from the contents data input unit 5 to the processing unit 3 (step ST 107 ), and then reproduced in the contents data reproducing section 32 (step ST 108 ). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 106 based on the current character information that the reproduction of the contents data is not permitted, the process of inputting the contents data (step ST 107 ) and the process of reproducing the contents data (step ST 108 ) are skipped.
  • if the third process is selected, a process of reading the contents data recorded in the contents data recording section 31 and reproducing it in the contents data reproducing section 32 is executed.
  • in step ST 109 , it is determined whether reproduction of the contents data recorded in the contents data recording section 31 is permitted. If the reproduction is permitted, desired contents data is read out of the contents data recording section 31 (step ST 110 ), and then reproduced in the contents data reproducing section 32 (step ST 111 ). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 109 based on the current character information that the reproduction of the contents data is not permitted, the process of reading the contents data (step ST 110 ) and the process of reproducing the contents data (step ST 111 ) are skipped.
  • the character information stored in the character information storing unit 1 is updated depending on the run status of the processing executed on the contents data (step ST 112 ). For example, the character information is updated depending on the total number of times of runs of the contents data processing and the frequency of runs of the contents data processing during a certain period.
  • the character information is updated depending on the run status of the contents data processing, and the form of the character reproduced in the character information reproducing unit 2 is gradually changed.
  • details of the contents data processing executed in the processing unit 3 are also changed. For example, the process of recording the contents data, which has been disabled for the character in the initial state, becomes enabled with repeated reproduction of the contents data. As another example, the quality in reproducing the contents data is improved depending on the degree of growth of the character, or the reproduction of supplemental contents data is newly permitted.
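One way to picture this unlocking behaviour is a capability table keyed on the degree of growth; the thresholds below are illustrative assumptions, not values from the patent:

```python
def capabilities(growth: int) -> dict:
    return {
        "recording_enabled": growth >= 20,   # disabled in the initial state
        "quality": "high" if growth >= 60 else "standard",
        "supplemental_data": growth >= 80,   # e.g. lyrics, jacket photographs
    }
```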
  • FIG. 3 is a schematic block diagram showing the second configuration example of the contents data processing apparatus according to the first embodiment.
  • a contents data processing apparatus 100 a shown in FIG. 3 comprises a character information storing unit 1 , a character information reproducing unit 2 , a processing unit 3 a , a character information updating unit 4 , a contents data input unit 5 , and a user interface (I/F) unit 6 .
  • the processing unit 3 a executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5 , and changes the quality in processing of the contents data depending on the character information stored in the character information storing unit 1 .
  • when the inputted contents data is charged data, the processing unit 3 a limits the processing of the charged contents data. For example, the processing unit 3 a disables the processing of the charged contents data or limits details of the processing in comparison with the ordinary processing.
  • when the charged contents data is encrypted, for example, the processing unit 3 a stops a decrypting process to disable the processing of the encrypted contents data.
  • the above-described limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6 . More specifically, when predetermined payment information is inputted from the user interface unit 6 , the processing unit 3 a checks whether the inputted payment information satisfies a predetermined payment condition. If the predetermined payment condition is satisfied, the above-described limitation on the processing of the charged contents data is released so that the content data processing in response to a user's processing request inputted from the user interface unit 6 can be executed. For example, the processing unit 3 a decrypts the encrypted charged contents data, thereby enabling the charged contents data to be processed.
  • the payment condition used in the step of checking the payment information by the processing unit 3 a is changed depending on the character information stored in the character information storing unit 1 . In other words, the payment condition becomes more severe or moderate depending on the growth of a character.
  • the charge of the contents data may be changed depending on information regarding the total purchase charge of the contents data or information regarding the number of times of purchases of the contents data, which is contained in the character information.
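A hedged sketch of such a payment condition, where growth moderates the condition and purchase history carried in the character information earns a further discount (all rates and field names are invented for illustration):

```python
def required_payment(base_charge: float, character: dict) -> float:
    growth_discount = min(0.5, character["growth"] / 200)          # up to 50% off
    loyalty_discount = 0.1 if character.get("purchases", 0) >= 10 else 0.0
    return base_charge * (1.0 - growth_discount - loyalty_discount)

def payment_satisfied(paid: float, base_charge: float, character: dict) -> bool:
    return paid >= required_payment(base_charge, character)
```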
  • the processing unit 3 a comprises, as shown in FIG. 3 , a billing section 33 in addition to a contents data recording section 31 and a contents data reproducing section 32 .
  • the billing section 33 determines whether the contents data inputted from the contents data input unit 5 is charged data, and then displays the determination result on the user interface unit 6 .
  • the billing section 33 checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 33 releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 31 and the contents data reproducing section 32 . If the inputted payment information does not satisfy the predetermined payment condition, the billing section 33 limits the processing of the charged contents data.
  • the billing section 33 prompts the user to input cash or any other equivalent (such as a prepaid card) through the user interface unit 6 , and then checks whether the inputted cash or the like is genuine and whether the amount of money is proper. In accordance with the check result, the billing section 33 enables the processes of recording and reproducing the contents data to be executed.
  • the billing section 33 may prompt the user to input the user's credit card number or ID information through the user interface unit 6 , and then refer to an authentication server or the like about whether the inputted information is proper. In accordance with the authentication result, the billing section 33 may permit the processes of recording and reproducing the contents data to be executed.
  • FIG. 4 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 100 a of FIG. 3 .
  • in FIG. 4 , the same symbols as those in FIG. 2 denote steps in which processing similar to that of FIG. 2 is executed.
  • the flowchart of FIG. 4 differs from the flowchart of FIG. 2 in that billing processes (step ST 114 and step ST 115 ) are inserted respectively between steps ST 104 and ST 105 and between steps ST 107 and ST 108 .
  • FIG. 5 is a flowchart for explaining one example of the billing process.
  • the billing section 33 first determines whether the contents data inputted from the contents data input unit 5 is charged data (step ST 201 ). If the inputted contents data is free, the subsequent billing process is skipped.
  • if the inputted contents data is charged data, whether to purchase the contents data or not is selected based on the user's judgment inputted from the user interface unit 6 (step ST 202 ). If the purchase of the contents data is selected in step ST 202 , predetermined payment information is inputted from the user interface unit 6 (step ST 203 ). Then, the billing section 33 checks whether the inputted payment information satisfies a predetermined payment condition (step ST 204 ).
  • the payment condition used in the above step is set depending on the character information stored in the character information storing unit 1 .
  • the payment condition becomes more severe or moderate depending on, for example, the growth of a character.
  • in step ST 205 , whether to release the limitation on processing of the contents data or not is selected. If the release of the limitation is selected, the billing section 33 releases the limitation on processing of the contents data (step ST 206 ). For example, a process of decrypting the encrypted contents data is executed.
  • if the user does not select the purchase of the contents data in step ST 202 , or if the release of the limitation on processing of the contents data is rejected in step ST 205 , the step of releasing the limitation on processing of the contents data (step ST 206 ) and the subsequent steps of processing the contents data (steps ST 105 and ST 108 ) are both skipped.
  • the process flow shifts to a step of updating the character (step ST 112 ).
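Putting the flow of FIG. 5 together as a sketch (function and key names are hypothetical; the condition is simplified to a single threshold derived from the character information):

```python
def billing_process(contents: dict, character: dict, paid: float) -> bool:
    if not contents["charged"]:                   # ST201: free data skips billing
        return True
    # ST204: payment condition set depending on the character information.
    condition = contents["charge"] * (1 - character["growth"] / 200)
    if paid < condition:
        return False                              # limitation stays in place
    contents["encrypted"] = False                 # ST206: release the limitation
    return True
```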
  • the above-described contents data processing apparatus 100 a of FIG. 3 can provide advantages similar to those of the contents data processing apparatus 100 of FIG. 1 .
  • in addition, since the payment condition in purchasing the charged contents data is changed depending on the character information, processing the contents data becomes more entertaining for users.
  • in the first embodiment described above, the character information is stored in the contents data processing apparatus, i.e., it is information associated with the contents data processing apparatus.
  • in the second embodiment, the character information is associated with the contents data. Therefore, even when, for example, the same contents data is processed by the same contents data processing apparatus, different characters are reproduced if the character information associated with one piece of contents data differs from that associated with another.
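A minimal sketch of this per-content association (the bundle structure is assumed, not specified by the patent): the same apparatus reproduces a different character, and applies different processing, for each piece of contents data:

```python
content_a = {"title": "Song A", "data": b"...",
             "character": {"kind": "dog", "growth": 5}}
content_b = {"title": "Song B", "data": b"...",
             "character": {"kind": "owl", "growth": 70}}

for content in (content_a, content_b):
    quality = "high" if content["character"]["growth"] >= 50 else "low"
    print(content["title"], "->", quality)   # different processing per content
```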
  • in the second embodiment, the contents data processing apparatus has five configuration examples, which are explained in sequence in the following description.
  • FIG. 6 is a schematic block diagram showing a first configuration example of the contents data processing apparatus according to the second embodiment of the present invention.
  • a contents data processing apparatus 101 shown in FIG. 6 comprises a character information reproducing unit 2 , a contents data input unit 5 , a user interface (I/F) unit 6 and a processing unit 7 .
  • the processing unit 7 executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5 . On this occasion, the quality of the processing executed by the processing unit 7 is changed depending on the character information associated with the inputted contents data.
  • since the processing unit 7 comprises a contents data recording section 71 and a contents data reproducing section 72 , as shown in FIG. 6 , the quality in recording and reproducing the contents data is changed depending on the character information associated with the contents data.
  • the contents data recording section 71 includes a storage device using a storage medium, such as a hard disk or a semiconductor memory, and records contents data inputted from the contents data input unit 5 in the storage device with the quality depending on the associated character information. When the contents data is recorded, the associated character information is recorded together with it.
  • the contents data recording section 71 may not include a storage device therein.
  • the inputted contents data and character information may be recorded in a storage device accessible via a wireless or wired communication line.
  • the quality in recording of the contents data to be changed in the processing unit 7 depending on the character information contains, for example, image quality of image data included in the contents data, sound quality of voice data therein, the number of voice channels, a data compression method and rate, etc.
  • the effective period, during which the contents data recorded in the processing unit 7 is reproducible, may be set and changed depending on the character information.
  • the processing unit 7 may just permit or prohibit the recording of the contents data depending on the character information.
  • the contents data reproducing section 72 includes, for example, an image reproducing device, such as a display, for reproducing image information, and a voice reproducing device, such as a speaker, for reproducing voice information.
  • the contents data reproducing section 72 reproduces image information and voice information, which are contained in the contents data inputted from the contents data input unit 5 or in the contents data read out of the contents data recording section 71 , with the quality depending on the character information associated with the contents data.
  • the contents data reproducing section 72 may include neither an image reproducing device nor a voice reproducing device.
  • reproduced image and voice information may be outputted to the user interface unit 6 or any other suitable device so as to reproduce the contents data in the device at the output destination.
  • the quality in reproduction of the contents data to be changed in the processing unit 7 depending on the character information contains, for example, image quality, sound quality, the number of voice channels, a data compression method and rate, etc.
  • the effective number of times and the effective period, at and during which the contents data recorded in the contents data recording section 71 of the processing unit 7 is reproducible, may be set and changed depending on the character information.
  • the processing unit 7 may just permit or prohibit the reproduction of the contents data depending on the character information.
  • the processing unit 7 may change the processing executed on the supplemental contents data depending on the character information associated with the contents data.
  • the processing unit 7 permits or prohibits the process of recording the supplemental contents data in the contents data recording section 71 and the process of reproducing the supplemental contents data in the contents data reproducing section 72 , or changes the quality in recording and reproduction of the supplemental contents data depending on the character information associated with the contents data.
  • the processing unit 7 prompts the user to select a process through user interface unit 6 . Then, the user selects one of first to third processes and the selection result is inputted from the user interface unit 6 to the processing unit 7 (step ST 301 ).
  • if the first process is selected, a process of recording contents data inputted from the contents data input unit 5 in the contents data recording section 71 is executed.
  • contents data is inputted from the contents data input unit 5 (step ST 302 ), and character information associated with the inputted contents data is reproduced in the character information reproducing unit 2 (step ST 303 ).
  • in step ST 304 , it is determined whether recording of the contents data is permitted. If the recording is permitted, the inputted contents data is recorded in the contents data recording section 71 (step ST 305 ). At this time, the quality in recording the contents data is set depending on the character information. For example, recording of supplemental contents data, such as lyrics (words) information, is permitted or prohibited in accordance with the character information, and the quality in recording the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 304 based on the character information associated with the inputted contents data that the recording of the contents data is not permitted, the process of recording the contents data (step ST 305 ) is skipped.
  • if the second process is selected, contents data is inputted from the contents data input unit 5 (step ST 306 ), and character information associated with the inputted contents data is reproduced in the character information reproducing unit 2 (step ST 307 ).
  • in step ST 308 , it is determined whether reproduction of the contents data is permitted. If the reproduction is permitted, the inputted contents data is reproduced in the contents data reproducing section 72 (step ST 309 ). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 308 based on the character information associated with the inputted contents data that the reproduction of the contents data is not permitted, the process of reproducing the contents data (step ST 309 ) is skipped.
  • if the third process is selected, a process of reading the contents data recorded in the contents data recording section 71 and reproducing it in the contents data reproducing section 72 is executed.
  • desired contents data is read out of the contents data recording section 71 (step ST 310 ), and character information associated with the read-out contents data is reproduced (step ST 311 ).
  • in step ST 312 , it is determined whether reproduction of the contents data is permitted. If the reproduction is permitted, the read-out contents data is reproduced in the contents data reproducing section 72 (step ST 313 ). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • if it is determined in step ST 312 based on the character information associated with the read-out contents data that the reproduction of the contents data is not permitted, the process of reproducing the contents data (step ST 313 ) is skipped.
  • the character information associated with the contents data is updated in the stage before the character information is supplied to the contents data processing apparatus 101 , and the character information is not changed inside the contents data processing apparatus 101 .
  • character information associated with the downloaded contents data is updated in the contents data supply apparatus.
  • alternatively, character information depending on the result of another character bringing-up game played by the user is associated with the contents data when the user downloads the contents data from the contents data supply apparatus.
  • FIG. 8 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the second embodiment.
  • a contents data processing apparatus 101 a shown in FIG. 8 comprises a character information reproducing unit 2 , a contents data input unit 5 , a user interface (I/F) unit 6 , and a processing unit 7 a .
  • the processing unit 7 a executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5 , and changes the quality in processing of the contents data depending on the character information associated with the inputted contents data.
  • when the inputted contents data is charged data, the processing unit 7 a limits the processing of the charged contents data. For example, the processing unit 7 a disables the processing of the charged contents data or limits details of the processing in comparison with the ordinary processing.
  • when the charged contents data is encrypted, for example, the processing unit 7 a stops a decrypting process to disable the processing of the encrypted contents data.
  • the above-described limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6 . More specifically, when predetermined payment information is inputted from the user interface unit 6 , the processing unit 7 a checks whether the inputted payment information satisfies a predetermined payment condition. If the predetermined payment condition is satisfied, the above-described limitation on the processing of the charged contents data is released so that the content data processing in response to a user's processing request inputted from the user interface unit 6 can be executed. For example, the processing unit 7 a decrypts the encrypted charged contents data, thereby enabling the charged contents data to be processed.
  • the payment condition used in the step of checking the payment information by the processing unit 7 a is changed depending on the character information associated with the content data. In other words, the payment condition becomes more severe or moderate depending on the growth of a character associated with the content data.
  • the processing unit 7 a comprises, as shown in FIG. 8 , a billing section 73 in addition to a contents data recording section 71 and a contents data reproducing section 72 .
  • the billing section 73 has the same function as the billing section 33 , shown in FIG. 3 , except that the payment condition for the charged content data is changed depending on the character information associated with the charged content data.
  • the billing section 73 determines whether the contents data inputted from the contents data input unit 5 is charged data, and then displays the determination result on the user interface unit 6 .
  • the billing section 73 checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 73 releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 71 and the contents data reproducing section 72 . If the inputted payment information does not satisfy the predetermined payment condition, the billing section 73 limits the processing of the charged contents data.
  • FIG. 9 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 a of FIG. 8 .
  • in FIG. 9 , the same symbols as those in FIG. 7 denote steps in which processing similar to that of FIG. 7 is executed.
  • the flowchart of FIG. 9 differs from the flowchart of FIG. 7 in that billing processes (step ST 315 and step ST 316 ) are inserted respectively between steps ST 304 and ST 305 and between steps ST 308 and ST 309 .
  • Each of those billing processes is substantially the same as the above-described process executed in accordance with the flowchart of FIG. 5 , except that the payment condition as a basis for checking the payment information in step ST 204 is changed depending on the character information associated with the contents data.
  • the above-described contents data processing apparatus 101 a of FIG. 8 can provide advantages similar to those of the contents data processing apparatus 101 of FIG. 6 .
  • in addition, since the payment condition in purchasing the charged contents data is changed depending on the character information, processing the contents data becomes more entertaining for users.
  • FIG. 10 is a schematic block diagram showing a third configuration example of a contents data processing apparatus according to the second embodiment.
  • a contents data processing apparatus 101 b shown in FIG. 10 comprises a character information reproducing unit 2 , a contents data input unit 5 , a user interface (I/F) unit 6 , and a processing unit 7 b .
  • the processing unit 7 b executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5 , and changes the quality in processing of the contents data depending on the character information associated with the inputted contents data.
  • the processing unit 7 b updates the character information associated with the contents data, which is to be processed, depending on the run status of the processing executed on the contents data.
  • the character information updated by the processing unit 7 b is recorded in a contents data recording section 71 in association with the contents data.
  • the processing unit 7 b comprises, as shown in FIG. 10 , a character information updating section 74 in addition to a contents data recording section 71 and a contents data reproducing section 72 .
  • the character information updating section 74 updates the character information associated with the contents data, which is to be processed, depending on the run status of the first to third processes and then records the updated character information in the contents data recording section 71 .
  • the character information updating section 74 increases the degree of growth of the character depending on the total number of times of runs of those processes, or reduces the degree of growth of the character if the frequency of runs of those processes during a certain period falls below a certain level.
  • FIG. 11 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 b of FIG. 10 .
  • in FIG. 11 , the same symbols as those in FIG. 7 denote steps in which processing similar to that of FIG. 7 is executed.
  • the flowchart of FIG. 11 differs from the flowchart of FIG. 7 in that a process of updating the character (step ST 317 ) is inserted after the first to third processes.
  • the character information associated with the contents data is updated in step ST 317 depending on the run status of the processing executed on the contents data, and the updated character information is recorded in the contents data recording section 71 in association with the contents data.
  • the character information associated with the contents data is updated depending on the run status of the contents data processing, and the form of the character reproduced in the character information reproducing unit 2 is gradually changed.
  • details of the contents data processing executed in the processing unit 7 b are also changed.
  • the process of recording the contents data, which has been disabled for the character in the initial state, becomes enabled with repeated reproduction of the contents data.
  • the quality in reproducing the contents data is improved depending on the degree of growth of the character, or the reproduction of supplemental contents data is newly permitted.
  • FIG. 12 is a schematic block diagram showing a fourth configuration example of a contents data processing apparatus according to the second embodiment.
  • a contents data processing apparatus 101 c shown in FIG. 12 comprises a character information reproducing unit 2 , a user interface (I/F) unit 6 , a processing unit 7 c , and a communication unit 8 .
  • in FIG. 12 , the same symbols as those in FIG. 10 denote the same components as those in FIG. 10 .
  • the processing unit 7 c has basically similar functions to those of the processing unit 7 b shown in FIG. 10 . More specifically, the processing unit 7 c executes predetermined processing designated by a user through the user interface unit 6 on contents data received by the communication unit 8 , and changes the quality in processing of the contents data depending on the character information associated with the received contents data. The processing unit 7 c updates the character information associated with the contents data, which is to be processed, depending on the run status of the processing executed on the contents data, and records the updated character information in association with the contents data.
  • the processing unit 7 c executes a process of selecting a desired one of the contents data recorded in a contents data recording section 71 in response to a user's instruction entered from the user interface unit 6 , and a process of transmitting, via the communication unit 8 (described below), the selected contents data to other contents data processing apparatuses or a contents data supply server for supplying contents data to those contents data processing apparatuses.
  • the communication unit 8 executes a process of exchanging contents data and character information with other contents data processing apparatuses or a contents data supply server for supplying contents data to those contents data processing apparatuses.
  • Any suitable communication method can be employed in the communication unit 8 .
  • wireless or wired communication is usable as required.
  • the communication may be performed via a network such as the Internet.
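As a rough sketch of the exchange, the selected contents data could travel together with its character information; the serialization and the plain TCP transport below are assumptions, since the patent allows any communication method:

```python
import json
import socket

def send_contents(host: str, port: int, content: dict) -> None:
    # Transmit the contents data's title and associated character
    # information as a length-prefixed JSON message.
    payload = json.dumps({"title": content["title"],
                          "character": content["character"]}).encode()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(payload).to_bytes(4, "big") + payload)
```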
  • FIG. 13 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 c of FIG. 12 .
  • in FIG. 13 , the same symbols as those in FIG. 11 denote steps in which processing similar to that of FIG. 11 is executed.
  • the flowchart of FIG. 13 differs from the flowchart of FIG. 11 in that a fourth process is added to the step of selecting a process (step ST 301 a ).
  • if the fourth process is selected in step ST 301 a , desired contents data corresponding to a user's instruction entered from the user interface unit 6 is read out of the contents data recorded in the contents data recording section 71 (step ST 318 ).
  • the read-out contents data is transmitted from the communication unit 8 to other contents data processing apparatuses or the contents data supply server (step ST 319 ).
  • a process of updating the character is executed in step ST 317 .
  • the above-described contents data processing apparatus 101 c of FIG. 12 can provide advantages similar to those of the contents data processing apparatus 101 b of FIG. 10 .
  • in addition, the contents data processing apparatus 101 c can exchange or share contents data with the other contents data processing apparatuses or the contents data supply server. Since a user can thus bring up a character in cooperation with other users, or exchange characters with them, the apparatus offers the user further increased enjoyment.
  • FIG. 14 is a schematic block diagram showing a fifth configuration example of a contents data processing apparatus according to the second embodiment.
  • a contents data processing apparatus 101 d shown in FIG. 14 comprises a character information reproducing unit 2 , a user interface (I/F) unit 6 , a processing unit 7 d , and a communication unit 8 .
  • in FIG. 14 , the same symbols as those in FIG. 12 denote the same components as those in FIG. 12 .
  • the processing unit 7 d has basically similar functions to those of the processing unit 7 c shown in FIG. 12 . More specifically, the processing unit 7 d executes predetermined processing designated by a user through the user interface unit 6 on contents data received by the communication unit 8 , and changes the quality in processing of the contents data depending on the character information associated with the received contents data. The processing unit 7 d updates the character information associated with the contents data, which is to be processed, depending on the run status of the processing executed on the contents data.
  • the processing unit 7 d selects a desired one of the contents data recorded in a contents data recording section 71 in response to a user's instruction entered from the user interface unit 6 , and then transmits the selected contents data from the communication unit 8 to other contents data processing apparatuses or a contents data supply server.
  • the processing unit 7 d also has a similar function to that of the processing unit 7 a shown in FIG. 8 . More specifically, when the contents data received by the communication unit 8 is charged data, the processing unit 7 d limits the processing of the charged contents data. The limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6 . The payment condition in the step of checking the payment information is changed depending on the character information associated with the content data.
  • the processing unit 7 d executes a process of limiting the use of the charged contents data, e.g., an encrypting process, as required.
  • the processing unit 7 d comprises, as shown in FIG. 14 , a billing section 73 a in addition to a contents data recording section 71 , a contents data reproducing section 72 , and a character information updating section 74 .
  • the billing section 73 a has the same function as the billing section 73 , shown in FIG. 8 , except that it executes a predetermined process of limiting the use of the charged contents data transmitted from the communication unit 8 .
  • the billing section 73 a determines whether the contents data received by the communication unit 8 is charged data, and then displays the determination result on the user interface unit 6 .
  • the billing section 73 a checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 73 a releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 71 and the contents data reproducing section 72 . If the inputted payment information does not satisfy the predetermined payment condition, the billing section 73 a limits the processing of the charged contents data.
  • the billing section 73 a executes a process of limiting the use of the charged contents data, e.g., an encrypting process, as required.
  • the character information recorded in the contents data recording section 71 together with the contents data is updated in the character information updating section 74 depending on the run status of the processing executed on the contents data (such as the number of times of processing runs and the frequency of processing runs).
  • FIG. 15 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 d of FIG. 14 .
  • In FIG. 15, the same symbols as those in FIG. 13 denote steps in which processing similar to that of FIG. 13 is executed.
  • a flowchart of FIG. 15 differs from the flowchart of FIG. 13 in that billing processes (step ST 315 and step ST 316 ) are inserted respectively between steps ST 304 and ST 305 and between steps ST 308 and ST 309 .
  • Each of those billing processes is substantially the same as the above-described process executed in accordance with the flowchart of FIG. 5, except that the payment condition used as a basis for checking the payment information in step ST 204 is changed depending on the character information associated with the contents data.
  • the flowchart of FIG. 15 also differs from the flowchart of FIG. 13 in that a process of limiting the use of the contents data (step ST 320 ) is inserted between a process of reading the contents data (step ST 318 ) and a process of transmitting the contents data (step ST 319 ) in a fourth process.
  • the process of limiting the use of the contents data (e.g., encrypting process) is executed, as required, when the transmitted contents data is charged data.
  • The above-described contents data processing apparatus 101 d of FIG. 14 can provide advantages similar to those of the contents data processing apparatus 101 c of FIG. 12.
  • In addition, since the payment condition for purchasing charged contents data is changed depending on the character information, the fun in processing the contents data is further increased and users can feel even higher amusingness.
  • a part or the whole of the configuration of the contents data processing apparatuses described above in the first and second embodiments can be realized using a processor, such as a computer, which executes processing in accordance with a program.
  • the program may be stored in a storage device such as a hard disk and a semiconductor memory, or a storage medium such as a magnetic disk and a magneto-optical disk, and then read by the processor to execute the processing, as an occasion requires.
  • The program may be stored in a server capable of communicating via a wired or wireless communication means, and then downloaded to the processor to execute the processing, as the occasion requires.
  • the character information associated with the contents data may be only information indicating character properties, such as the nature and type of each character, or may contain image information and voice information reproducible in the character information reproducing section.
  • FIG. 16 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a third embodiment of the present invention.
  • a contents data processing apparatus 200 shown in FIG. 16 comprises a processing unit 201 , a recording unit 202 , a reproducing unit 203 , a user interface (I/F) unit 204 , a character information creating unit 205 , and a character memory 206 .
  • the processing unit 201 executes a process of reproducing or recording contents data in response to a user's instruction entered from the user interface unit 204 .
  • The processing unit 201 reads contents data, which has been selected in response to the user's instruction, out of the contents data recorded in the recording unit 202, and then reproduces the read-out contents data in the reproducing unit 203.
  • Character information created in the character information creating unit 205 (described later in more detail) corresponding to the read-out contents data is also reproduced together with the read-out contents data.
  • the content and the character may be displayed on the same screen in a superimposed relation, or the character may be displayed to appear on the screen before or after reproduction of the content.
  • When the reproducing unit 203 includes a plurality of displays, the content and the character may be displayed on respective different screens independently of each other.
  • When the process of recording contents data is executed, the processing unit 201 records character information, which has been created corresponding to the contents data, in the recording unit 202 in association with the contents data.
  • the processing unit 201 may join the contents data and the character information into a single file and record the file in the recording unit 202 , or may record the contents data and the character information in separate files.
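  • As a concrete illustration only (the patent does not prescribe a file format), the following Python sketch shows the two recording layouts described above: the contents data and the character information packed into one file, or written as separate files. The header layout, file names, and field names are all assumptions.

```python
import json
import pathlib

def record_with_character(contents: bytes, character: dict,
                          path: pathlib.Path, joined: bool = True) -> None:
    if joined:
        # Single-file layout: 4-byte big-endian header length, then the
        # character information as a JSON header, then the contents data.
        header = json.dumps(character).encode("utf-8")
        path.write_bytes(len(header).to_bytes(4, "big") + header + contents)
    else:
        # Separate-file layout: contents file plus a character sidecar file.
        path.write_bytes(contents)
        path.with_suffix(".char.json").write_text(json.dumps(character))

record_with_character(b"...audio...", {"owner": "user-A", "stage": 1},
                      pathlib.Path("song1.bin"))
```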
  • the recording unit 202 records the contents data and the character information, which are supplied from the processing unit 201 , under write control of the processing unit 201 , and outputs the recorded contents data to the processing unit 201 under read control of the processing unit 201 .
  • the recording unit 202 may be constituted using, for example, a stationary storage device such as a hard disk, or a combination of a removable storage medium such as a magneto-optical disk or a semiconductor memory card, a reader, and a write device.
  • the reproducing unit 203 reproduces the contents data and the character information under control of the processing unit 201 .
  • the reproducing unit 203 includes, for example, a display for reproducing image information and a speaker for reproducing voice information, and reproduces images and voices corresponding to the contents data and the character information by those devices.
  • the user interface unit 204 includes input devices, such as a switch, a button, a mouse, a keyboard and a microphone, and transmits, to the processing unit 201 , an instruction given from the user who performs a predetermined operation using those input devices.
  • the user interface unit 204 may output the processing result of the processing unit 201 to the user using output devices such as a display, a lamp and a speaker.
  • the character information creating unit 205 creates character information depending on specific information associated with the contents data that is recorded or reproduced in the processing unit 201 .
  • the character information created in the character information creating unit 205 contains information for reproducing images, voices, etc., of a virtual character corresponding to the content in the reproducing unit 203 .
  • For example, when price information is associated with the contents data, the character information may be created in the creating unit 205 depending on the price information.
  • The character information may also be created in the creating unit 205 depending on information that is associated with the contents data and indicates the type of the relevant contents data.
  • In this case, the type of the contents data is determined, and character information depending on the determination result is created in the creating unit 205.
  • For music content, based on information of music genre associated with the music contents data (or other related information), it is determined which one of predetermined genres (such as rock, jazz, Japanese popular songs, and classics) the genre of the music content corresponds to.
  • the character information depending on the determination result is then created in the creating unit 205 .
  • In the case of Japanese popular songs, for example, character information such as dressing a character in a kimono (traditional Japanese clothing) is created in the creating unit 205.
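  • A minimal sketch of the genre-dependent creation described above, assuming a hypothetical genre-to-appearance table; the genres named are the examples given in the description, and everything else is illustrative.

```python
# Hypothetical genre-to-appearance table; rock, jazz, Japanese popular
# songs, and classics are the example genres named in the description.
GENRE_OUTFITS = {
    "rock": {"clothing": "leather jacket"},
    "jazz": {"clothing": "suit"},
    "japanese popular songs": {"clothing": "kimono"},
    "classics": {"clothing": "tailcoat"},
}

def create_character_for_music(related_info: dict) -> dict:
    # Determine which predetermined genre the music content belongs to,
    # then create character information reflecting that determination.
    genre = related_info.get("genre", "unknown")
    outfit = GENRE_OUTFITS.get(genre, {"clothing": "plain clothes"})
    return {"kind": "musician", **outfit}

print(create_character_for_music({"genre": "japanese popular songs"}))
```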
  • The character information may be created depending on information that is associated with the contents data and indicates the number of times the relevant contents data has been copied in the past.
  • the character information may be newly created depending on the character information that is associated with the content data and has been created before.
  • the character information is not changed regarding unchangeable attributes such as the price and the type, while the character information is changed depending on changeable attributes such as the number of times of copying or reproduction of the content data.
  • The creating unit 205 may create character information such that information indicating the number of times of copying or reproduction of the contents data, contained in the character information, is updated each time the contents data is copied or reproduced, and the image and voice of a character are changed when the number of times of copying or reproduction reaches a predetermined value.
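  • The count-and-evolve behavior described above might be sketched as follows; the counter names and the predetermined value of 10 are assumptions, not values taken from the description.

```python
def update_on_use(character: dict, operation: str, evolve_at: int = 10) -> dict:
    # Update the per-operation counter held inside the character
    # information each time the contents data is copied or reproduced.
    key = operation + "_count"            # e.g. "copy_count", "play_count"
    character[key] = character.get(key, 0) + 1
    # Change the character's image/voice form once the counter reaches
    # the predetermined value (10 is an assumed example).
    if character[key] == evolve_at:
        character["stage"] = character.get("stage", 0) + 1
    return character

print(update_on_use({"play_count": 9}, "play"))  # reaches 10 -> stage rises
```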
  • the character information created in the creating unit 205 may contain ID information for identifying the owner of a character.
  • When character information containing another person's ID information is associated with the contents data, the character information for the user may be created in the creating unit 205 depending on the character information for the other person.
  • More specifically, it is determined whether the ID information contained in the character information associated with the contents data is identical to the user's own ID information. If not identical, character information for the user is newly created depending on information regarding the type, nature, state, etc. of the character contained in the character information associated with the contents data. When the character for the other person is a child, the character information may be created such that the character for the user is provided as a child of almost the same age.
  • When the character information containing another person's ID information includes a message issued from the character contained therein, character information may be created that causes the character for the user to reply to the message. For example, when the character for the other person issues a message "How many years old are you?", the character information may be created such that the character for the user replies "I am five years old".
  • the character information creating unit 205 has the functions described above.
  • the character memory 206 stores information necessary for a character to grow and change.
  • the information stored in the character memory 206 is read and used, as required, when the character information is created in the character information creating unit 205 .
  • The character memory 206 also stores information necessary for reproducing character images and voices in the reproducing unit 203.
  • the character memory 206 may also store the ID information for identifying the user of the contents data processing apparatus.
  • the character memory 206 may be constituted using, for example, a stationary storage device such as a hard disk, or a combination of a removable storage medium, such as a magneto-optical disk or a semiconductor memory card, and a reader.
  • FIG. 17 is a flowchart for explaining one example of the contents data processing executed in the contents data processing apparatus 200 of FIG. 16 .
  • Step ST 401
  • Contents data to be processed is inputted to the processing unit 201 .
  • the contents data instructed by the user to be processed is selected from among the contents data stored in the recording unit 202 , and then read into the processing unit 201 .
  • Step ST 402
  • Character information S 5 is created depending on specific information S 1 associated with the inputted contents data.
  • Image and voice information of a character (e.g., information regarding the face, clothing and voice of a character, and messages) is read out of the character memory 206 and processed depending on information regarding the price, the type, the number of times of copying, the number of times of reproduction, etc. of the contents data, whereby the character information is created in the creating unit 205.
  • When character information containing another person's ID information is associated with the contents data, character information for the user is created in the creating unit 205 depending on the character information for the other person.
  • FIG. 18 is a flowchart for explaining one example of the more detailed process in the step ST 402 of creating the character information.
  • In step ST 4021, it is first determined whether character information created before is associated with the inputted contents data. If it is determined in step ST 4021 that no character information created before is associated with the inputted contents data, new character information is created (step ST 4022).
  • If it is determined in step ST 4021 that character information created before is associated with the inputted contents data, it is then determined whether the user's own ID information is contained in the character information created before (step ST 4023). If it is determined that another person's ID information is contained in the character information created before, character information for the user is newly created depending on information regarding the type, nature, state, etc. of the character contained in the character information for the other person (step ST 4024).
  • If it is determined in step ST 4023 that the user's own ID information is contained in the character information associated with the inputted contents data, the character information is updated as required (step ST 4025).
  • the updating process in step ST 4025 is executed, for example, by updating the character information when the number of times of copying or reproduction of the contents data reaches a predetermined value.
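  • The following sketch walks through the FIG. 18 decision flow in Python; the field names, the derivation rule for another person's character, and the update threshold are all assumed for illustration.

```python
def create_character_info(contents: dict, own_id: str) -> dict:
    prior = contents.get("character")
    # ST4021 -> ST4022: no earlier character information, so create anew.
    if prior is None:
        return {"owner": own_id, "stage": 0, "play_count": 0}
    # ST4023 -> ST4024: another person's ID, so derive the user's own
    # character from the other person's type, nature, state, etc.
    if prior.get("owner") != own_id:
        return {"owner": own_id, "stage": 0, "play_count": 0,
                "age": prior.get("age"), "type": prior.get("type")}
    # ST4023 -> ST4025: the user's own character, so update as required
    # (here: advance a stage when the play count hits a multiple of 10).
    prior["play_count"] = prior.get("play_count", 0) + 1
    if prior["play_count"] % 10 == 0:
        prior["stage"] = prior.get("stage", 0) + 1
    return prior

print(create_character_info({"character": {"owner": "other", "age": 5,
                                           "type": "child"}}, "me"))
```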
  • Step ST 403
  • This step executes a process of recording or reproducing the inputted contents data and the character information created corresponding to it.
  • In the recording process, the updated or newly created character information is recorded in the recording unit 202 in association with the contents data.
  • In the reproducing process, the updated or newly created character information, or the character information read out of the recording unit 202 in association with the contents data, is reproduced together with the contents data.
  • With the contents data processing apparatus 200 shown in FIG. 16, since a game factor of bringing up a character associated with the contents data is added to the ordinary fun in reproducing the contents data, users can feel higher amusingness with processing of the contents data.
  • Moreover, the brought-up character is not a character associated with the contents data processing apparatus, but a character that moves and grows with the contents data. Therefore, each time different contents data is reproduced, users are given the opportunity of enjoying characters in different grown-up states. As a result, users can be more surely kept from becoming weary of bringing up characters in comparison with the case of bringing up only one character.
  • For example, when the character for the other person is a teacher type, the character for the user is set to a pupil type correspondingly and displayed together with the teacher type character. Likewise, when the character for the other person issues a message "How much is it?", the character for the user is set to issue a message "You may have it for nothing" correspondingly and displayed together with the character for the other person.
  • Further, since a user can make communications with another user, who records or reproduces contents data, via a virtual character reproduced in the reproducing unit 203, the user can be given further increased amusingness.
  • a configuration of a contents data processing apparatus according to a fourth embodiment of the present invention will be described below with reference to FIGS. 19 and 20 .
  • In the fourth embodiment, character information is created depending on time information and/or position information.
  • FIG. 19 is a schematic block diagram showing the configuration of the contents data processing apparatus according to the fourth embodiment of the present invention.
  • a contents data processing apparatus 200 a shown in FIG. 19 comprises a processing unit 201 , a recording unit 202 , a reproducing unit 203 , a user interface (I/F) unit 204 , a character information creating unit 205 a , a character memory 206 , a time information producing unit 207 , and a position information producing unit 208 .
  • the time information producing unit 207 produces time information, such as information regarding the time of day, information regarding time zones (forenoon, afternoon, etc.) per day, information regarding the day of week, information regarding the month, and information regarding the season.
  • the position information producing unit 208 produces information regarding the geographical position of the contents data processing apparatus 200 a .
  • The producing unit 208 may produce position information by utilizing a mechanism of the GPS (Global Positioning System) for measuring the geographical position in accordance with signals from positioning satellites, etc.
  • the character information creating unit 205 a has, in addition to the same function as that of the character information creating unit 205 shown in FIG. 16 , the function of creating character information depending on the time information produced in the time information producing unit 207 and the position information produced in the position information producing unit 208 .
  • For example, when the time information indicates summer, the creating unit 205 a creates character information dressing a character in a yukata (an informal kimono for summer wear).
  • As another example, the creating unit 205 a creates character information dressing a character in an aloha shirt depending on the time information and/or the position information.
  • the form of a character created in the creating unit 205 a may be changed during the reproduction depending on the time information and/or the position information.
  • FIG. 20 is a flowchart for explaining one example of a process of reproducing contents data executed in the contents data processing apparatus 200 a of FIG. 19 . More specifically, FIG. 20 shows one example of the detailed process in step ST 403 in the flowchart of FIG. 17 .
  • After the start of reproduction of the contents data (step ST 4031), it is determined whether the time of day indicated by the time information produced by the time information producing unit 207 has reached a predetermined time of day (step ST 4032). If it is determined in step ST 4032 that the indicated time of day has reached the predetermined time of day, the character information under reproduction is changed depending on the predetermined time of day (step ST 4033). For example, when the time of day at which the contents data is being reproduced reaches 12:00 midnight, the clothing of the character is changed to pajamas.
  • In step ST 4034, it is determined whether the district in which the contents data processing apparatus 200 a is located, as indicated by the position information produced from the position information producing unit 208, has changed. If it is determined that the district where the contents data processing apparatus 200 a is located has changed, the character information is also changed depending on the district to which the location of the contents data processing apparatus 200 a has changed (step ST 4035). For example, when it is determined that the contents data processing apparatus 200 a, which is reproducing contents data regarding professional baseball, has moved from one district to another, the mark of the baseball cap put on the character is changed to the mark representing the professional baseball team in the district to which the contents data processing apparatus 200 a has moved.
  • After the above-described determination regarding the time information (step ST 4032) and the above-described determination regarding the district where the contents data processing apparatus 200 a is located (step ST 4034), it is determined whether the reproduction of the contents data is to be completed, or whether the end of reproduction of the contents data is instructed from the user interface unit 204 (step ST 4036). If it is determined that the reproduction of the contents data is to be completed, or that the end of reproduction is instructed, the process of reproducing the contents data is brought to an end (step ST 4037).
  • If it is determined in step ST 4036 that the reproduction of the contents data is not to be completed and that the end of reproduction is not instructed, i.e., that the reproduction of the contents data is to be continued, the process flow returns to step ST 4032 to continue the process of reproducing the contents data.
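  • One way to picture the FIG. 20 reproduction loop is the sketch below; the midnight clothing change and the district-dependent cap mark follow the examples in the description, while the time window, data layout, and helper names are assumptions.

```python
import datetime

def reproduce(contents: dict, character: dict, get_district, frames):
    # ST4031: reproduction starts; remember the starting district.
    district = get_district()
    for _frame in frames:
        # ST4032/ST4033: change the character at a predetermined time of day
        # (an assumed 00:00-06:00 window stands in for "midnight" here).
        now = datetime.datetime.now().time()
        if datetime.time(0, 0) <= now < datetime.time(6, 0):
            character["clothing"] = "pajamas"
        # ST4034/ST4035: change the character when the district changes.
        here = get_district()
        if here != district:
            district = here
            character["cap_mark"] = f"baseball team of {here}"
        # ... render _frame together with the character here ...
    # ST4036/ST4037: the loop ends when frames run out or stop is requested.
    return character

print(reproduce({"title": "ballgame"}, {"clothing": "uniform"},
                lambda: "Tokyo", range(3)))
```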
  • The above-described contents data processing apparatus 200 a of FIG. 19 can provide advantages similar to those of the contents data processing apparatus 200 of FIG. 16.
  • Further, since the character information can be changed depending on the time and location at which contents data is processed, it is possible to provide a variety of variations in patterns for bringing up a character and to give users further increased amusingness.
  • a fifth embodiment of a contents data processing apparatus will be described below with reference to FIG. 21 .
  • FIG. 21 is a schematic block diagram showing a configuration of the contents data processing apparatus according to the fifth embodiment of the present invention.
  • a contents data processing apparatus 200 b shown in FIG. 21 comprises a processing unit 201 a , a recording unit 202 , a reproducing unit 203 , a user interface (I/F) unit 204 , a character information creating unit 205 , a character memory 206 , and a communication unit 209 .
  • the communication unit 209 executes a process of exchanging contents data and character information with other contents data processing apparatuses or a contents data supply server for supplying contents data to those contents data processing apparatuses.
  • Any suitable communication method can be employed in the communication unit 209 .
  • wired or wireless communication is usable as required.
  • the communication unit 209 may communicate via a network such as the Internet.
  • The processing unit 201 a has, in addition to the same function as that of the processing unit 201 shown in FIG. 16, the function of selecting a desired one of the contents data recorded in the recording unit 202 in response to a user's instruction entered from the user interface unit 204, and then transmitting the selected contents data from the communication unit 209 to other contents data processing apparatuses or a contents data supply server for supplying contents data to the other contents data processing apparatuses.
  • the communication unit 209 may be controlled to execute a process of accessing the other contents data processing apparatuses or the contents data supply server and then downloading the contents data held in them as contents data to be processed, and a process of, in response to a download request from the other contents data processing apparatuses or the contents data supply server, supplying the contents data recorded in the recording unit 202 to them.
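  • As an illustration of the upload/download exchange described above, here is a toy in-memory stand-in for a contents data supply server; a real communication unit 209 could use any wired or wireless method, and nothing here reflects an actual protocol.

```python
class ContentsServer:
    """Toy in-memory stand-in for a contents data supply server."""
    def __init__(self):
        self.store = {}

    def upload(self, name: str, contents: bytes, character: dict) -> None:
        # The character information always travels with its contents data.
        self.store[name] = (contents, character)

    def download(self, name: str):
        return self.store[name]

server = ContentsServer()
server.upload("song1", b"...audio...", {"owner": "user-A", "stage": 2})
contents, character = server.download("song1")
print(character)  # the downloaded data arrives with its character attached
```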
  • The above-described contents data processing apparatus 200 b of FIG. 21 can provide advantages similar to those of the contents data processing apparatus 200 of FIG. 16.
  • the contents data processing apparatus 200 b can exchange or share contents data with the other contents data processing apparatuses or the contents data supply server. Therefore, since a user can bring up a character in cooperation with other users or exchange a character with other users, it is possible to give the user further increased amusingness.
  • a part or the whole of the configuration of the contents data processing apparatuses described above in the third to fifth embodiments can be realized using a processor, such as a computer, which executes processing in accordance with a program.
  • the program may be stored in a storage device such as a hard disk and a semiconductor memory, or a storage medium such as a magnetic disk and a magneto-optical disk, and then read by the processor to execute the processing, as an occasion requires.
  • The program may be stored in a server capable of communicating via a wired or wireless communication means, and then downloaded to the processor to execute the processing, as the occasion requires.
  • The character information recorded in the recording unit 202 in association with the contents data may contain direct information used for reproducing a character (e.g., image information and voice information of a character), or may instead contain indirect information for designating the reproduced form of a character (e.g., information of numbers made to correspond to respective patterns of a character). In the latter case, since the data amount of the character information is reduced in comparison with that of character information containing image information and voice information, the required recording capacity of the recording unit 202 can be held down.
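  • The saving from indirect character information can be sketched as follows; the pattern table and field names are assumptions, but the point that an index is far smaller than embedded image and voice data holds generally.

```python
import json

# Direct form: image/voice data embedded in the character information.
direct = {"image": "x" * 20000, "voice": "y" * 40000}   # stand-in payloads

# Indirect form: only a pattern number; the apparatus resolves it against
# its own character memory (the table below is an assumed example).
indirect = {"pattern": 7}
PATTERN_TABLE = {7: {"image": "girl_kimono.png", "voice": "girl.wav"}}

def resolve(char_info: dict) -> dict:
    # Expand indirect character information via the local pattern table.
    if "pattern" in char_info:
        return PATTERN_TABLE[char_info["pattern"]]
    return char_info

print(len(json.dumps(indirect)), "bytes recorded instead of",
      len(json.dumps(direct)))
```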

Abstract

Contents data processing apparatus comprising a reproducing block for reproducing contents data from a recording medium, a character information generation block for generating character information based on first information accompanied with said contents data, and a character memory for storing second information regarding growth or change.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 10/364,495, filed on Feb. 11, 2003, the disclosure of which is incorporated herein by reference. application Ser. No. 10/364,495 claims priority from Japanese Patent Application No. P2002-043660, filed on Feb. 20, 2002 and Japanese Patent Application No. P2002-053502, filed on Feb. 28, 2002, the disclosures of which are hereby incorporated by reference herein.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a contents data processing method and a contents data processing apparatus. More particularly, the present invention relates to a contents data processing method and a contents data processing apparatus in which contents data is processed using character information.
  • 2. Description of the Related Art
  • Recently, character bringing-up games, in which users bring up virtual characters and enjoy the growing process of the characters, have been developed in various forms. For example, there are portable game machines in which users bring up characters displayed on a liquid crystal panel by repeating breeding operations such as feeding and exercising the characters, and application software for personal computers, which allows users to dialogue with characters to be brought up.
  • Japanese Laid-Open Patent Publication No. 11-231880, for example, discloses an information distributing system in which a character image displayed on an information terminal device is brought up depending on the download history of music, e.g., karaoke songs, from an information distributing apparatus to the information terminal device.
  • The above-described portable game machines and application software lay main emphasis on bringing-up of characters so that users are interested in the process of bringing up characters. The above-described information distributing system increases entertainingness by adding a factor of growth of characters to playing of music.
  • However, as seen from the fact that a boom of the above-described portable game machines has been temporary, a period during which users become enthusiastic in “bringing-up characters” is not so long, and those games tend to lose popularity among users in a short period. Accordingly, a factor capable of keeping users from becoming weary soon is demanded in those games in addition to the factor of “bringing-up of characters”.
  • SUMMARY OF THE INVENTION
  • It is therefore an object of the present invention to provide a contents data processing method which resolves the above-mentioned problem.
  • It is another object of the present invention to provide a contents data processing apparatus which resolves the above-mentioned problem.
  • According to the present invention, there is provided a contents data processing method. The processing method comprises the steps of, at the time of processing contents data, reproducing character information changed with processing of the contents data, and selectively changing processing of the contents data in accordance with the reproduced character information.
  • According to the present invention, there is provided a contents data processing apparatus including a storing unit, a reproducing unit, and a processing unit. The storing unit stores character information. The reproducing unit reproduces character information read out of the storing unit. The processing unit processes supplied contents data. The processing unit selectively changes processing of the contents data in accordance with the character information reproduced by the reproducing unit.
  • According to the present invention, there is provided a contents data processing apparatus including a reproducing unit and a processing unit. The reproducing unit reproduces character information associated with supplied contents data. The processing unit processes the supplied contents data. The processing unit selectively changes processing of the contents data in accordance with the character information reproduced by the reproducing unit.
  • According to the present invention, there is provided a contents data processing apparatus including a creating unit and a processing unit. The creating unit creates character information from information associated with supplied contents data. The processing unit processes the supplied contents data. The processing unit selectively changes processing of the contents data in accordance with the character information created by the creating unit.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic block diagram showing a first configuration example of a contents data processing apparatus according to a first embodiment of the present invention;
  • FIG. 2 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 1;
  • FIG. 3 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the first embodiment of the present invention;
  • FIG. 4 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 3;
  • FIG. 5 is a flowchart for explaining one example of a billing process;
  • FIG. 6 is a schematic block diagram showing a first configuration example of a contents data processing apparatus according to a second embodiment of the present invention;
  • FIG. 7 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 6;
  • FIG. 8 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the second embodiment of the present invention;
  • FIG. 9 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 8;
  • FIG. 10 is a schematic block diagram showing a third configuration example of a contents data processing apparatus according to the second embodiment of the present invention;
  • FIG. 11 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 10;
  • FIG. 12 is a schematic block diagram showing a fourth configuration example of a contents data processing apparatus according to the second embodiment of the present invention;
  • FIG. 13 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 12;
  • FIG. 14 is a schematic block diagram showing a fifth configuration example of a contents data processing apparatus according to the second embodiment of the present invention;
  • FIG. 15 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 14;
  • FIG. 16 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a third embodiment of the present invention;
  • FIG. 17 is a flowchart for explaining one example of contents data processing executed in the contents data processing apparatus of FIG. 16;
  • FIG. 18 is a flowchart for explaining one example of a step of creating character information in the flowchart of FIG. 17;
  • FIG. 19 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a fourth embodiment of the present invention;
  • FIG. 20 is a flowchart for explaining one example of a process of reproducing contents data executed in the contents data processing apparatus of FIG. 19; and
  • FIG. 21 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a fifth embodiment of the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A first embodiment of the present invention will be described below.
  • A contents data processing apparatus according to the first embodiment has two configuration examples, which are explained in sequence in the following description.
  • FIRST CONFIGURATION EXAMPLE
  • FIG. 1 is a schematic block diagram showing a first configuration example of the contents data processing apparatus according to the first embodiment of the present invention.
  • A contents data processing apparatus 100 shown in FIG. 1 comprises a character information storing unit 1, a character information reproducing unit 2, a processing unit 3, a character information updating unit 4, a contents data input unit 5, and a user interface (I/F) unit 6.
  • The character information storing unit 1 stores information regarding a character brought up with processing of contents data (hereinafter referred to simply as “character information”).
  • The character information contains, for example, information representing a temper, a growth process, and other nature of each character, specifically degrees of temper, growth, etc., in numerical values, and information indicating the types of characters, specifically the kinds of persons, animals, etc. In addition to such information indicating the nature and types of characters, information reproduced in the character information reproducing unit 2, described later, e.g., information of images and voices, is also stored in the character information storing unit 1.
  • The character information reproducing unit 2 reproduces information depending on the character information stored in the character information storing unit 1.
  • For example, the reproducing unit 2 reproduces, from among plural pieces of image and voice information stored in the character information storing unit 1 beforehand, information selected depending on the information indicating the nature and type of the character.
  • Depending on the information indicating the nature and type of the character, the reproducing unit 2 may additionally process the image and voice of the character. In other words, the reproducing unit 2 may additionally execute various processes such as changing the shape, hue, brightness and action of the character image, the loudness and tone of the character voice, and the number of characters.
  • When a contents data reproducing/output device, e.g., a display and a speaker, in the processing unit 3 is the same as a character information reproducing/output device, the character information reproducing unit 2 may execute a process of combining a reproduction result of the processing unit 3 and a reproduction result of the character information with each other.
  • The processing unit 3 executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5. On this occasion, the quality of the processing executed by the processing unit 3 is changed depending on the character information stored in the character information storing unit 1.
  • For example, in a configuration in which the processing unit 3 comprises a contents data recording section 31 and a contents data reproducing section 32, as shown in FIG. 1, the quality in recording and reproducing the contents data is changed depending on the character information.
  • The contents data recording section 31 includes a storage device using a storage medium, such as a hard disk or a semiconductor memory, and records contents data inputted from the contents data input unit 5 in the storage device with the quality depending on the character information.
  • Incidentally, the contents data recording section 31 may not include a storage device therein. In this case, the inputted contents data may be recorded in a storage device accessible via a wireless or wired communication line.
  • The quality in recording of the contents data to be changed in the processing unit 3 depending on the character information contains, for example, image quality of image data included in the contents data, sound quality of voice data therein, the number of voice channels (stereo/monaural, etc.), a data compression method and rate, etc.
  • The effective number of times and the effective period, at and during which the contents data recorded in the processing unit 3 is reproducible, may be set and changed depending on the character information. In this case, the contents data having exceeded the set effective number of times or the set effective period is erased or made unreadable.
  • More simply, the processing unit 3 may just permit or prohibit the recording of the contents data depending on the character information.
  • The contents data reproducing section 32 includes, for example, an image reproducing device, such as a display, for reproducing image information, and a voice reproducing device, such as a speaker, for reproducing voice information. The contents data reproducing section 32 reproduces image information and voice information, which are contained in the contents data inputted from the contents data input unit 5 or the contents data read out of the contents data recording section 31, with the quality depending on the character information.
  • Incidentally, the contents data reproducing section 32 may include neither an image reproducing device nor a voice reproducing device. In this case, reproduced image and voice information may be outputted to the user interface unit 6 or any other suitable device so as to reproduce the contents data in the device at the output destination.
  • As with the recording quality mentioned above, the quality in reproducing the contents data to be changed in the processing unit 3 depending on the character information contains, for example, image quality, sound quality, the number of voice channels, a data compression method and rate, etc.
  • The effective number of times and the effective period, at and during which the contents data recorded in the contents data recording section 31 of the processing unit 3 is reproducible, may be set and changed depending on the character information. In this case, the contents data having exceeded the set effective number of times or the set effective period is erased or made unreadable.
  • More simply, the processing unit 3 may just permit or prohibit the reproduction of the contents data depending on the character information.
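  • A minimal sketch of the effective-number-of-times and effective-period limits described above, assuming illustrative limit values; contents data past either limit is simply reported as unreadable here, whereas the apparatus may also erase it.

```python
import datetime

def may_reproduce(record: dict) -> bool:
    # Effective number of times: refuse once the count is used up.
    if record["plays"] >= record["max_plays"]:
        return False
    # Effective period: refuse once the expiry date has passed.
    if datetime.date.today() > record["expires"]:
        return False
    record["plays"] += 1
    return True

record = {"plays": 0, "max_plays": 5,
          "expires": datetime.date.today() + datetime.timedelta(days=30)}
print(may_reproduce(record))  # True for the first five plays within 30 days
```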
  • Further, when there is contents data that additionally supplements main contents data (hereinafter referred to as "supplemental contents data"), the processing unit 3 may change the processing executed on the supplemental contents data depending on the character information. In the configuration example of FIG. 1, for example, the processing unit 3 permits or prohibits the process of recording the supplemental contents data in the contents data recording section 31 and the process of reproducing the supplemental contents data in the contents data reproducing section 32, or changes the quality in recording and reproduction of the supplemental contents data depending on the character information.
  • Examples of the supplemental contents data include words (lyrics) information, jacket photographs, profile information of artists, and liner notes, which are added to music contents data. The supplemental contents data may also be a coupon offering various bonuses.
  • The character information updating unit 4 updates the character information stored in the character information storing unit 1 depending on the run status of the processing executed in the processing unit 3.
  • For example, the character information updating unit 4 updates the information regarding the nature and type of the character by increasing the degree of growth of the character whenever the contents data is recorded or reproduced, and by reducing the degree of growth of the character when the contents data is neither recorded nor reproduced.
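  • The updating rule described above might look like the following sketch; the growth counter and the increment/decay rates are assumed examples.

```python
def update_character(character: dict, processed: bool) -> dict:
    # Growth rises whenever contents data is recorded or reproduced...
    if processed:
        character["growth"] = character.get("growth", 0) + 1
    # ...and falls back when the apparatus sits unused (assumed decay of 1).
    else:
        character["growth"] = max(0, character.get("growth", 0) - 1)
    return character

print(update_character({"growth": 3}, processed=False))  # {'growth': 2}
```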
  • The contents data input unit 5 is a block for inputting contents data to the processing unit 3, and may comprise any suitable one of various devices, such as an information reader for reading contents data recorded on a storage medium, e.g., a memory card and a magneto-optic disk, and a communication device for accessing a device in which contents data is held, and then downloading the contents data.
  • The user interface unit 6 transmits, to the processing unit 3, an instruction given from the user through a predetermined operation performed by the user using a switch, a button, a mouse, a keyboard, a microphone, etc. A display, a lamp, a speaker, etc. may also be used to output the processing result of the processing unit 3 to the user.
  • One example of contents data processing executed in the contents data processing apparatus 100 of FIG. 1 will be described below with reference to a flowchart of FIG. 2.
  • First, the character information stored in the character information storing unit 1 is reproduced in the character information reproducing unit 2 for displaying the progress in bringing-up of the character information to the user (step ST101). The processing unit 3 prompts the user to select a process through the user interface unit 6, and the user selects one of first to third processes (step ST102). The selection result is inputted from the user interface unit 6 to the processing unit 3.
  • First Process:
  • In a first process, a process of recording contents data inputted from the contents data input unit 5 in the contents data recording section 31 is executed.
  • Based on the character information currently stored in the character information storing unit 1, it is determined whether recording of the contents data is permitted (step ST103). If the recording is permitted, the contents data is inputted from the contents data input unit 5 to the processing unit 3 (step ST104), and then recorded in the contents data recording section 31 (step ST105). At this time, the quality in recording the contents data is set depending on the character information. For example, recording of supplemental contents data, such as words information, is permitted or prohibited in accordance with the character information, and the quality in recording the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined based on the current character information that the recording of the contents data is not permitted, the process of inputting the contents data (step ST104) and the process of recording the contents data (step ST105) are skipped.
  • Second Process:
  • In a second process, a process of reproducing contents data inputted from the contents data input unit 5 in the contents data reproducing section 32 is executed.
  • Based on the current character information, it is determined whether reproduction of the contents data inputted from the contents data input unit 5 is permitted (step ST106). If the reproduction is permitted, the contents data is inputted from the contents data input unit 5 to the processing unit 3 (step ST107), and then reproduced in the contents data reproducing section 32 (step ST108). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined in step ST106 based on the current character information that the reproduction of the contents data is not permitted, the process of inputting the contents data (step ST107) and the process of reproducing the contents data (step ST108) are skipped.
  • Third Process:
  • In a third process, a process of reading the contents data recorded in the contents data recording section 31 and reproducing it in the contents data reproducing section 32 is executed.
  • Based on the current character information, it is determined whether reproduction of the contents data recorded in the contents data recording section 31 is permitted (step ST109). If the reproduction is permitted, desired contents data is read out of the contents data recording section 31 (step ST110), and then reproduced in the contents data reproducing section 32 (step ST111). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined in step ST109 based on the current character information that the reproduction of the contents data is not permitted, the process of reading the contents data (step ST110) and the process of reproducing the contents data (step ST111) are skipped.
  • After the execution of the above-described first to third processes, the character information stored in the character information storing unit 1 is updated depending on the run status of the processing executed on the contents data (step ST112). For example, the character information is updated depending on the total number of times of runs of the contents data processing and the frequency of runs of the contents data processing during a certain period.
  • Thus, as the processing of the contents data is repeated many times, the character information is updated depending on the run status of the contents data processing, and the form of the character reproduced in the character information reproducing unit 2 is gradually changed.
  • Responsive to the change of the character, details of the contents data processing executed in the processing unit 3 are also changed. For example, the process of recording the contents data, which has been disabled for the character in the initial state, becomes enabled with repeated reproduction of the contents data. As another example, the quality in reproducing the contents data is improved depending on the degree of growth of the character, or the reproduction of supplemental contents data is newly permitted.
  • With the contents data processing apparatus 100 of FIG. 1, therefore, since the fun of the character bringing-up game is added to the fun with ordinary processing of contents data, increased amusingness can be given to users in processing the contents data.
  • Users can enjoy not only the growth of a character, but also changes in processing of the contents data with the growth of the character. This keeps the users from becoming weary soon, unlike the character bringing-up game in which main emphasis is put on the growth of a character. For example, as the contents data is reproduced many times, users can enjoy the progress of growth of the character form. Further, users can feel satisfaction with an improvement in processing quality of the contents data, such as gradually improved image and sound quality of the reproduced contents data, a change from monaural to stereo sounds, or release from prohibition of copying of the contents data. Thus, since the fun in processing the contents data is combined with the fun of the character bringing-up game, it is possible to give the users increased amusingness as a result of the synergistic effect.
  • SECOND CONFIGURATION EXAMPLE
  • A second configuration example of a contents data processing apparatus according to the first embodiment will be described below.
  • FIG. 3 is a schematic block diagram showing the second configuration example of the contents data processing apparatus according to the first embodiment.
  • A contents data processing apparatus 100 a shown in FIG. 3 comprises a character information storing unit 1, a character information reproducing unit 2, a processing unit 3 a, a character information updating unit 4, a contents data input unit 5, and a user interface (I/F) unit 6. Note that components in FIG. 3 common to those in FIG. 1 are denoted by the same reference symbols, and a detailed description of those components is omitted here.
  • As with the processing unit 3 in FIG. 1, the processing unit 3 a executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5, and changes the quality in processing of the contents data depending on the character information stored in the character information storing unit 1.
  • When the inputted contents data is charged data, the processing unit 3 a limits the processing of the charged contents data. For example, the processing unit 3 a disables the processing of the charged contents data or limits details of the processing in comparison with the ordinary processing. When the contents data is encrypted, the processing unit 3 a stops a decrypting process to disable the processing of the encrypted contents data.
  • The above-described limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6. More specifically, when predetermined payment information is inputted from the user interface unit 6, the processing unit 3 a checks whether the inputted payment information satisfies a predetermined payment condition. If the predetermined payment condition is satisfied, the above-described limitation on the processing of the charged contents data is released so that the content data processing in response to a user's processing request inputted from the user interface unit 6 can be executed. For example, the processing unit 3 a decrypts the encrypted charged contents data, thereby enabling the charged contents data to be processed.
  • The payment condition used in the step of checking the payment information by the processing unit 3 a is changed depending on the character information stored in the character information storing unit 1. In other words, the payment condition becomes more severe or moderate depending on the growth of a character.
  • The charge of the contents data may be changed depending on information regarding the total purchase charge of the contents data or information regarding the number of times of purchases of the contents data, which is contained in the character information.
  • The processing unit 3 a comprises, as shown in FIG. 3, a billing section 33 in addition to a contents data recording section 31 and a contents data reproducing section 32.
  • The billing section 33 determines whether the contents data inputted from the contents data input unit 5 is charged data, and then displays the determination result on the user interface unit 6. When predetermined payment information corresponding to the display is inputted from the user interface unit 6, the billing section 33 checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 33 releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 31 and the contents data reproducing section 32. If the inputted payment information does not satisfy the predetermined payment condition, the billing section 33 limits the processing of the charged contents data.
  • For example, the billing section 33 prompts the user to input cash or any other equivalent (such as a prepaid card) through the user interface unit 6, and then checks whether the inputted cash or the like is genuine and whether the amount of money is proper. In accordance with the check result, the billing section 33 enables the processes of recording and reproducing the contents data to be executed.
  • Alternatively, the billing section 33 may prompt the user to input the user's credit card number or ID information through the user interface unit 6, and then refer to an authentication server or the like about whether the inputted information is proper. In accordance with the authentication result, the billing section 33 may permit the processes of recording and reproducing the contents data to be executed.
  • One example of contents data processing executed in the contents data processing apparatus 100 a of FIG. 3 will be described below.
  • FIG. 4 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 100 a of FIG. 3. In FIG. 4, the same symbols as those in FIG. 2 denote steps in each of which similar processing is executed as in FIG. 2.
  • As seen from comparing FIGS. 4 and 2, a flowchart of FIG. 4 differs from the flowchart of FIG. 2 in that billing processes (step ST114 and step ST115) are inserted respectively between steps ST104 and ST105 and between steps ST107 and ST108.
  • FIG. 5 is a flowchart for explaining one example of the billing process.
  • According to the flowchart of FIG. 5, the billing section 33 first determines whether the contents data inputted from the contents data input unit 5 is charged data (step ST201). If the inputted contents data is free, the subsequent billing process is skipped.
  • If the inputted contents data is charged data, whether to purchase the contents data or not is selected based on user's judgment inputted from the user interface unit 6 (step ST202). If the purchase of the contents data is selected in step ST202, predetermined payment information is inputted from the user interface unit 6 (step ST203). Then, the billing section 33 checks whether the inputted payment information satisfies a predetermined payment condition (step ST204).
  • The payment condition used in the above step is set depending on the character information stored in the character information storing unit 1. The payment condition becomes more severe or moderate depending on, for example, the growth of a character.
  • In accordance with the result of checking the payment information in step ST204, whether to release the limitation on processing of the contents data or not is selected (step ST205). If the release of the limitation is selected, the billing section 33 releases the limitation on processing of the contents data (step ST206). For example, a process of decrypting the encrypted contents data is executed.
  • If the user does not select the purchase of the contents data in step ST202, or if the release of the limitation on processing of the contents data is rejected in step ST205, the step of releasing the limitation on processing of the contents data (step ST206) and the subsequent steps of processing the contents data (steps ST105 and ST108) are both skipped. The process flow shifts to a step of updating the character (step ST112).
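  • Pulling the FIG. 5 steps together, a hedged Python sketch of the billing flow might read as follows; the discount rule standing in for the character-dependent payment condition is an assumption, as are all names and prices.

```python
def billing(contents: dict, character: dict, payment: float,
            wants_purchase: bool) -> bool:
    # ST201: free contents data skips the billing process entirely.
    if not contents.get("charged", False):
        return True
    # ST202: the user decides whether to purchase.
    if not wants_purchase:
        return False
    # ST204: check the payment against a condition that the character
    # information can make more moderate (an assumed growth discount).
    discount = min(0.5, 0.02 * character.get("growth", 0))
    price = contents["price"] * (1.0 - discount)
    # ST205/ST206: on success, release the limitation (e.g. decrypt).
    if payment >= price:
        contents["locked"] = False
        return True
    return False

song = {"charged": True, "price": 300.0, "locked": True}
print(billing(song, {"growth": 10}, payment=250.0, wants_purchase=True))
```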
  • The above-described contents data processing apparatus 100 a of FIG. 3 can provide advantages similar to those of the contents data processing apparatus 100 of FIG. 1. In addition, since the payment condition in purchasing the charged contents data is changed depending on the character information, the fun in processing the contents data is further increased and users can enjoy even greater amusement.
  • A second embodiment of the present invention will be described below.
  • In each of the above-described contents data processing apparatuses of FIGS. 1 and 3, the character information is stored in the contents data processing apparatus, i.e., it is information associated with the contents data processing apparatus. In the second embodiment described below, on the other hand, the character information is associated with the contents data itself. Therefore, even when the same contents data is processed by the same contents data processing apparatus, different characters are reproduced if the character information associated with one item of contents data differs from that associated with another.
  • A contents data processing apparatus according to the second embodiment has five configuration examples, which are explained in sequence in the following description.
  • FIRST CONFIGURATION EXAMPLE
  • FIG. 6 is a schematic block diagram showing a first configuration example of the contents data processing apparatus according to the second embodiment of the present invention.
  • A contents data processing apparatus 101 shown in FIG. 6 comprises a character information reproducing unit 2, a contents data input unit 5, a user interface (I/F) unit 6 and a processing unit 7. Note that, in FIG. 6, the same symbols as those in FIG. 1 denote the same components as those in FIG. 1.
  • The processing unit 7 executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5. On this occasion, the quality of the processing executed by the processing unit 7 is changed depending on the character information associated with the inputted contents data.
  • For example, in a configuration in which the processing unit 7 comprises a contents data recording section 71 and a contents data reproducing section 72, as shown in FIG. 6, the quality in recording and reproducing the contents data is changed depending on the character information associated with the contents data.
  • The contents data recording section 71 includes a storage device using a storage medium, such as a hard disk or a semiconductor memory, and records contents data inputted from the contents data input unit 5 in the storage device with the quality depending on the associated character information. At the same time as recording the contents data, the associated character information is also recorded together.
  • Incidentally, the contents data recording section 71 may not include a storage device therein. In this case, the inputted contents data and character information may be recorded in a storage device accessible via a wireless or wired communication line.
  • As with the contents data recording section 31 shown in FIG. 1, the recording quality that the processing unit 7 changes depending on the character information includes, for example, the image quality of image data included in the contents data, the sound quality of voice data therein, the number of voice channels, and the data compression method and rate.
  • The effective period, during which the contents data recorded in the processing unit 7 is reproducible, may be set and changed depending on the character information.
  • More simply, the processing unit 7 may just permit or prohibit the recording of the contents data depending on the character information.
  • The contents data reproducing section 72 includes, for example, an image reproducing device, such as a display, for reproducing image information, and a voice reproducing device, such as a speaker, for reproducing voice information. The contents data reproducing section 72 reproduces image information and voice information, which are contained in the contents data inputted from the contents data input unit 5 or in the contents data read out of the contents data recording section 71, with the quality depending on the character information associated with the contents data.
  • Incidentally, the contents data reproducing section 72 may include neither an image reproducing device nor a voice reproducing device. In this case, reproduced image and voice information may be outputted to the user interface unit 6 or any other suitable device so as to reproduce the contents data in the device at the output destination.
  • As with the recording quality mentioned above, the reproduction quality that the processing unit 7 changes depending on the character information includes, for example, image quality, sound quality, the number of voice channels, and the data compression method and rate.
  • The effective number of times and the effective period, at and during which the contents data recorded in the contents data recording section 71 of the processing unit 7 is reproducible, may be set and changed depending on the character information.
  • More simply, the processing unit 7 may just permit or prohibit the reproduction of the contents data depending on the character information.
  • Further, when there is contents data supplied in addition to main contents data (hereinafter referred to as “supplemental contents data”), the processing unit 7 may change the processing executed on the supplemental contents data depending on the character information associated with the contents data. In the configuration example of FIG. 6, for example, the processing unit 7 permits or prohibits the process of recording the supplemental contents data in the contents data recording section 71 and the process of reproducing the supplemental contents data in the contents data reproducing section 72, or changes the quality in recording and reproduction of the supplemental contents data depending on the character information associated with the contents data.
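  • A minimal sketch of how the processing unit 7 might map character information to recording or reproduction quality follows; the growth_level attribute, the thresholds, and the quality values are all assumptions chosen for illustration.

      # Hypothetical mapping from character growth to processing quality.
      def select_quality(character):
          if character.growth_level >= 10:      # well-grown character: best quality
              return {"audio_bitrate_kbps": 320, "channels": 2, "supplemental": True}
          if character.growth_level >= 5:       # partly grown: intermediate quality
              return {"audio_bitrate_kbps": 192, "channels": 2, "supplemental": False}
          return {"audio_bitrate_kbps": 96, "channels": 1, "supplemental": False}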
  • One example of contents data processing executed in the contents data processing apparatus 101 of FIG. 6 will be described below with reference to a flowchart of FIG. 7.
  • First, the processing unit 7 prompts the user to select a process through the user interface unit 6. Then, the user selects one of first to third processes and the selection result is inputted from the user interface unit 6 to the processing unit 7 (step ST301).
  • First Process:
  • In a first process, a process of recording contents data inputted from the contents data input unit 5 in the contents data recording section 71 is executed.
  • First, contents data is inputted from the contents data input unit 5 (step ST302), and character information associated with the inputted contents data is reproduced in the character information reproducing unit 2 (step ST303).
  • Then, based on the character information associated with the inputted contents data, it is determined whether recording of the contents data is permitted (step ST304). If the recording is permitted, the inputted contents data is recorded in the contents data recording section 71 (step ST305). At this time, the quality in recording the contents data is set depending on the character information. For example, recording of supplemental contents data, such as words information, is permitted or prohibited in accordance with the character information, and the quality in recording the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined in step ST304 based on the character information associated with the inputted contents data that the recording of the contents data is not permitted, the process of recording the contents data (step ST305) is skipped.
  • Second Process:
  • In a second process, a process of reproducing contents data inputted from the contents data input unit 5 in the contents data reproducing section 72 is executed.
  • First, contents data is inputted from the contents data input unit 5 (step ST306), and character information associated with the inputted contents data is reproduced in the character information reproducing unit 2 (step ST307).
  • Then, based on the character information associated with the inputted contents data, it is determined whether reproduction of the contents data is permitted (step ST308). If the reproduction is permitted, the inputted contents data is reproduced in the contents data reproducing section 72 (step ST309). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined in step ST308 based on the character information associated with the inputted contents data that the reproduction of the contents data is not permitted, the process of reproducing the contents data (step ST309) is skipped.
  • Third Process:
  • In a third process, a process of reading the contents data recorded in the contents data recording section 71 and reproducing it in the contents data reproducing section 72 is executed.
  • First, desired contents data is read out of the contents data recording section 71 (step ST310), and character information associated with the read-out contents data is reproduced (step ST311).
  • Then, based on the character information associated with the contents data, it is determined whether reproduction of the contents data is permitted (step ST312). If the reproduction is permitted, the read-out contents data is reproduced in the contents data reproducing section 72 (step ST313). At this time, the quality in reproducing the contents data is set depending on the character information. For example, reproduction of supplemental contents data is permitted or prohibited in accordance with the character information, and the quality in reproducing the supplemental contents data is set depending on the character information.
  • On the other hand, if it is determined in step ST312 based on the character information associated with the read-out contents data that the reproduction of the contents data is not permitted, the process of reproducing the contents data (step ST313) is skipped.
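  • Taken together, the first to third processes of FIG. 7 amount to a dispatch on the user's selection, gated by the character information. The sketch below is an assumption-laden illustration: the input_unit, recorder, player, and reproduce_character objects and the permission methods are hypothetical, and select_quality reuses the earlier sketch.

      # Hypothetical dispatcher for the first to third processes of FIG. 7.
      def run_selected_process(selection, apparatus, ui):
          if selection == 1:                                # first process: record input
              data = apparatus.input_unit.read()            # step ST302
              character = apparatus.reproduce_character(data)       # step ST303
              if character.permits_recording():             # step ST304
                  apparatus.recorder.record(data, select_quality(character))   # ST305
          elif selection == 2:                              # second process: play input
              data = apparatus.input_unit.read()            # step ST306
              character = apparatus.reproduce_character(data)       # step ST307
              if character.permits_reproduction():          # step ST308
                  apparatus.player.play(data, select_quality(character))       # ST309
          else:                                             # third process: read and play
              data = apparatus.recorder.read(ui.selected_title())   # step ST310
              character = apparatus.reproduce_character(data)       # step ST311
              if character.permits_reproduction():          # step ST312
                  apparatus.player.play(data, select_quality(character))       # ST313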
  • In the contents data processing described above, it is premised that the character information associated with the contents data is updated in the stage before the character information is supplied to the contents data processing apparatus 101, and the character information is not changed inside the contents data processing apparatus 101.
  • For example, each time contents data is downloaded from a contents data supply apparatus (not shown) to the contents data processing apparatus 101, character information associated with the downloaded contents data is updated in the contents data supply apparatus. By updating the character information for each download made by different users, it is possible to bring up a character changing in linkage with popularity of the contents data.
  • As an alternative example, character information depending on the result of another character bringing-up game, which has been played by a user, is associated with contents data when the user downloads the contents data from the contents data supply apparatus.
  • With the contents data processing apparatus 101 of FIG. 6, therefore, since the fun of the character bringing-up game is added to the fun of ordinary processing of contents data, increased amusement can be given to users in processing the contents data, as with the contents data processing apparatuses of FIGS. 1 and 3. Moreover, since users can enjoy not only the growth of a character, but also changes in processing of the contents data with the growth of the character, it is possible to give the users even greater amusement.
  • SECOND CONFIGURATION EXAMPLE
  • FIG. 8 is a schematic block diagram showing a second configuration example of a contents data processing apparatus according to the second embodiment.
  • A contents data processing apparatus 101 a shown in FIG. 8 comprises a character information reproducing unit 2, a contents data input unit 5, a user interface (I/F) unit 6, and a processing unit 7 a. Note that, in FIG. 8, the same symbols as those in FIG. 6 denote the same components as those in FIG. 6.
  • As with the processing unit 7 shown in FIG. 6, the processing unit 7 a executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5, and changes the quality in processing of the contents data depending on the character information associated with the inputted contents data.
  • When the inputted contents data is charged data, the processing unit 7 a limits the processing of the charged contents data. For example, the processing unit 7 a disables the processing of the charged contents data or limits details of the processing in comparison with the ordinary processing. When the contents data is encrypted, the processing unit 7 a stops a decrypting process to disable the processing of the encrypted contents data.
  • The above-described limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6. More specifically, when predetermined payment information is inputted from the user interface unit 6, the processing unit 7 a checks whether the inputted payment information satisfies a predetermined payment condition. If the predetermined payment condition is satisfied, the above-described limitation on the processing of the charged contents data is released so that the contents data processing in response to a user's processing request inputted from the user interface unit 6 can be executed. For example, the processing unit 7 a decrypts the encrypted charged contents data, thereby enabling the charged contents data to be processed.
  • The payment condition used in the step of checking the payment information by the processing unit 7 a is changed depending on the character information associated with the contents data. In other words, the payment condition becomes stricter or more lenient depending on the growth of the character associated with the contents data.
  • The processing unit 7 a comprises, as shown in FIG. 8, a billing section 73 in addition to a contents data recording section 71 and a contents data reproducing section 72.
  • The billing section 73 has the same function as the billing section 33 shown in FIG. 3, except that the payment condition for the charged contents data is changed depending on the character information associated with the charged contents data.
  • More specifically, the billing section 73 determines whether the contents data inputted from the contents data input unit 5 is charged data, and then displays the determination result on the user interface unit 6. When predetermined payment information corresponding to the display is inputted from the user interface unit 6, the billing section 73 checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 73 releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 71 and the contents data reproducing section 72. If the inputted payment information does not satisfy the predetermined payment condition, the billing section 73 limits the processing of the charged contents data.
  • FIG. 9 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 a of FIG. 8. In FIG. 9, the same symbols as those in FIG. 7 denote steps in each of which similar processing is executed as in FIG. 7.
  • As seen from comparing FIGS. 9 and 7, the flowchart of FIG. 9 differs from the flowchart of FIG. 7 in that billing processes (step ST315 and step ST316) are inserted respectively between steps ST304 and ST305 and between steps ST308 and ST309.
  • Each of those billing processes is substantially the same as the above-described process executed in accordance with the flowchart of FIG. 5, except that the payment condition used as a basis for checking the payment information in step ST204 is changed depending on the character information associated with the contents data.
  • Thus, the above-described contents data processing apparatus 101 a of FIG. 8 can provide advantages similar to those of the contents data processing apparatus 101 of FIG. 6. In addition, since the payment condition in purchasing the charged contents data is changed depending on the character information, the fun in processing the contents data is further increased and users can enjoy even greater amusement.
  • THIRD CONFIGURATION EXAMPLE
  • FIG. 10 is a schematic block diagram showing a third configuration example of a contents data processing apparatus according to the second embodiment.
  • A contents data processing apparatus 101 b shown in FIG. 10 comprises a character information reproducing unit 2, a contents data input unit 5, a user interface (I/F) unit 6, and a processing unit 7 b. Note that, in FIG. 10, the same symbols as those in FIG. 6 denote the same components as those in FIG. 6.
  • As with the processing unit 7 shown in FIG. 6, the processing unit 7 b executes predetermined processing designated by a user through the user interface unit 6 on contents data inputted from the contents data input unit 5, and changes the quality in processing of the contents data depending on the character information associated with the inputted contents data.
  • The processing unit 7 b updates the character information associated with the contents data, which is to be processed, depending on the run status of the processing executed on the contents data. The character information updated by the processing unit 7 b is recorded in a contents data recording section 71 in association with the contents data.
  • The processing unit 7 b comprises, as shown in FIG. 10, a character information updating section 74 in addition to a contents data recording section 71 and a contents data reproducing section 72.
  • In a process of recording the contents data inputted from the contents data input unit 5 in the contents data recording section 71 (i.e., the first process) and in a process of reading the contents data recorded in the contents data recording section 71 and reproducing it in the contents data reproducing section 72 (i.e., the third process), the character information updating section 74 updates the character information associated with the contents data to be processed depending on the run status of those processes, and then records the updated character information in the contents data recording section 71. For example, the character information updating section 74 increases the degree of growth of the character depending on the total number of times those processes have been run, or reduces the degree of growth of the character if the frequency of runs of those processes during a certain period falls below a certain level.
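  • One way the character information updating section 74 could implement this rule is sketched below; the one-level-per-ten-runs growth, the one-week window, and the minimum-run threshold are assumptions made for illustration.

      # Hypothetical growth update based on run counts and run frequency.
      WEEK_SECONDS = 7 * 24 * 3600        # assumed observation window
      MIN_RUNS_PER_WEEK = 3               # assumed minimum activity level

      def update_character(character, now):
          character.total_runs += 1
          character.run_times.append(now)
          if character.total_runs % 10 == 0:          # grow with cumulative use
              character.growth_level += 1
          recent = [t for t in character.run_times if now - t <= WEEK_SECONDS]
          if len(recent) < MIN_RUNS_PER_WEEK:         # regress when use falls off
              character.growth_level = max(0, character.growth_level - 1)
          return character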
  • FIG. 11 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 b of FIG. 10. In FIG. 11, the same symbols as those in FIG. 7 denote steps in each of which similar processing is executed as in FIG. 7.
  • As seen from comparing FIGS. 11 and 7, the flowchart of FIG. 11 differs from the flowchart of FIG. 7 in that a process of updating the character (step ST317) is inserted after each of the first to third processes.
  • More specifically, after one of the first to third processes is run, the character information associated with the contents data to be processed is updated in step ST317 depending on the run status of the processing executed on the contents data, and the updated character information is recorded in the contents data recording section 71 in association with the contents data.
  • In the case of reproducing the inputted contents data in the second process, however, the processes of updating and recording the character information are not executed because the process of recording the inputted contents data is not executed.
  • Thus, as the processing of the contents data is repeated many times, the character information associated with the contents data is updated depending on the run status of the contents data processing, and the form of the character reproduced in the character information reproducing unit 2 is gradually changed.
  • Responsive to the change of the character, details of the contents data processing executed in the processing unit 7 b are also changed. For example, the process of recording the contents data, which has been disabled for the character in the initial state, becomes enabled with repeated reproduction of the contents data. As another example, the quality in reproducing the contents data is improved depending on the degree of growth of the character, or the reproduction of supplemental contents data is newly permitted.
  • With the contents data processing apparatus 101 b of FIG. 10, therefore, since the fun of the character bringing-up game is added to the fun of ordinary processing of contents data, increased amusement can be given to users in processing the contents data.
  • Since users can enjoy not only the growth of a character, but also changes in processing of the contents data with the growth of the character, it is possible to give the users even greater amusement.
  • FOURTH CONFIGURATION EXAMPLE
  • FIG. 12 is a schematic block diagram showing a fourth configuration example of a contents data processing apparatus according to the second embodiment.
  • A contents data processing apparatus 101 c shown in FIG. 12 comprises a character information reproducing unit 2, a user interface (I/F) unit 6, a processing unit 7 c, and a communication unit 8. Note that, in FIG. 12, the same symbols as those in FIG. 10 denote the same components as those in FIG. 10.
  • The processing unit 7 c has basically similar functions to those of the processing unit 7 b shown in FIG. 10. More specifically, the processing unit 7 c executes predetermined processing designated by a user through the user interface unit 6 on contents data received by the communication unit 8, and changes the quality in processing of the contents data depending on the character information associated with the received contents data. The processing unit 7 c updates the character information associated with the contents data, which is to be processed, depending on the run status of the processing executed on the contents data, and records the updated character information in association with the contents data.
  • Further, the processing unit 7 c executes a process of selecting a desired one of the contents data recorded in the contents data recording section 71 in response to a user's instruction entered from the user interface unit 6, and a process of transmitting, via the communication unit 8 (described below), the selected contents data to other contents data processing apparatuses or to a contents data supply server for supplying contents data to those contents data processing apparatuses.
  • The communication unit 8 executes a process of exchanging contents data and character information with other contents data processing apparatuses or a contents data supply server for supplying contents data to those contents data processing apparatuses. Any suitable communication method can be employed in the communication unit 8. For example, wireless or wired communication is usable as required. Alternatively, the communication may be performed via a network such as the Internet.
  • FIG. 13 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 c of FIG. 12. In FIG. 13, the same symbols as those in FIG. 11 denote steps in each of which similar processing is executed as in FIG. 11.
  • As seen from comparing FIGS. 13 and 11, the flowchart of FIG. 13 differs from the flowchart of FIG. 11 in that a fourth process is added in the step of selecting a process (step ST301 a).
  • More specifically, if the fourth process is selected in step ST301 a in response to a user's instruction entered from the user interface unit 6, desired contents data corresponding to the user's instruction is read out of the contents data recorded in the contents data recording section 71 (step ST318). The read-out contents data is transmitted from the communication unit 8 to other contents data processing apparatuses or the contents data supply server (step ST319). Then, as with the first to third processes, a process of updating the character is executed in step ST317.
  • Thus, the above-described contents data processing apparatus 101 c of FIG. 12 can provide advantages similar to those of the contents data processing apparatus 101 b of FIG. 10. In addition, the contents data processing apparatus 101 c can exchange or share contents data with other contents data processing apparatuses or the contents data supply server. Therefore, since a user can bring up a character in cooperation with other users or exchange characters with other users, it is possible to give the user even greater amusement.
  • FIFTH CONFIGURATION EXAMPLE
  • FIG. 14 is a schematic block diagram showing a fifth configuration example of a contents data processing apparatus according to the second embodiment.
  • A contents data processing apparatus 101 d shown in FIG. 14 comprises a character information reproducing unit 2, a user interface (I/F) unit 6, a processing unit 7 d, and a communication unit 8. Note that, in FIG. 14, the same symbols as those in FIG. 12 denote the same components as those in FIG. 12.
  • The processing unit 7 d has basically similar functions to those of the processing unit 7 c shown in FIG. 12. More specifically, the processing unit 7 d executes predetermined processing designated by a user through the user interface unit 6 on contents data received by the communication unit 8, and changes the quality in processing of the contents data depending on the character information associated with the received contents data. The processing unit 7 d updates the character information associated with the contents data to be processed depending on the run status of the processing executed on the contents data. Further, the processing unit 7 d selects a desired one of the contents data recorded in a contents data recording section 71 in response to a user's instruction entered from the user interface unit 6, and then transmits the selected contents data from the communication unit 8 to other contents data processing apparatuses or a contents data supply server.
  • In addition, the processing unit 7 d also has a similar function to that of the processing unit 7 a shown in FIG. 8. More specifically, when the contents data received by the communication unit 8 is charged data, the processing unit 7 d limits the processing of the charged contents data. The limitation on the processing of the charged contents data is released upon predetermined payment information being inputted from the user interface unit 6. The payment condition used in the step of checking the payment information is changed depending on the character information associated with the contents data.
  • When the charged contents data is transmitted from the communication unit 8, the processing unit 7 d executes a process of limiting the use of the charged contents data, e.g., an encrypting process, as required.
  • The processing unit 7 d comprises, as shown in FIG. 14, a billing section 73 a in addition to a contents data recording section 71, a contents data reproducing section 72, and a character information updating section 74.
  • The billing section 73 a has the same function as the billing section 73 shown in FIG. 8, except that it additionally executes a predetermined process of limiting the use of the charged contents data transmitted from the communication unit 8.
  • More specifically, the billing section 73 a determines whether the contents data received by the communication unit 8 is charged data, and then displays the determination result on the user interface unit 6. When predetermined payment information corresponding to the display is inputted from the user interface unit 6, the billing section 73 a checks whether the inputted payment information satisfies a predetermined payment condition. If the inputted payment information satisfies the predetermined payment condition, the billing section 73 a releases the limitation on processing of the charged contents data so that the charged contents data can be processed in the contents data recording section 71 and the contents data reproducing section 72. If the inputted payment information does not satisfy the predetermined payment condition, the billing section 73 a limits the processing of the charged contents data.
  • When the charged contents data is transmitted from the communication unit 8 to the other contents data processing apparatuses or the contents data supply server, the billing section 73 a executes a process of limiting the use of the charged contents data, e.g., an encrypting process, as required.
  • The character information recorded in the contents data recording section 71 together with the contents data is updated in the character information updating section 74 depending on the run status of the processing executed on the contents data (such as the number of times of processing runs and the frequency of processing runs).
  • FIG. 15 is a flowchart for explaining one example of the contents data processing executed in the above-described contents data processing apparatus 101 d of FIG. 14. In FIG. 15, the same symbols as those in FIG. 13 denote steps in each of which similar processing is executed as in FIG. 13.
  • As seen from comparing FIGS. 15 and 13, the flowchart of FIG. 15 differs from the flowchart of FIG. 13 in that billing processes (step ST315 and step ST316) are inserted respectively between steps ST304 and ST305 and between steps ST308 and ST309.
  • Each of those billing processes is substantially the same as the above-described process executed in accordance with the flowchart of FIG. 5, except that the payment condition used as a basis for checking the payment information in step ST204 is changed depending on the character information associated with the contents data.
  • The flowchart of FIG. 15 also differs from the flowchart of FIG. 13 in that a process of limiting the use of the contents data (step ST320) is inserted between a process of reading the contents data (step ST318) and a process of transmitting the contents data (step ST319) in a fourth process.
  • Stated otherwise, in the case of transmitting the contents data in the fourth process, the process of limiting the use of the contents data (e.g., encrypting process) is executed, as required, when the transmitted contents data is charged data.
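  • A minimal sketch of step ST320 follows, assuming symmetric encryption as the use-limiting process; Fernet from the third-party "cryptography" package serves here purely as a stand-in cipher, not as the cipher the apparatus actually uses.

      # Hypothetical use-limiting step (ST320) before transmission (ST319).
      from cryptography.fernet import Fernet

      def limit_use_before_transmit(contents_bytes, is_charged, key):
          if not is_charged:                  # free data is transmitted as-is
              return contents_bytes
          # Charged data is encrypted so the receiver must satisfy the
          # payment condition before the data becomes processable.
          return Fernet(key).encrypt(contents_bytes)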
  • Thus, the above-described contents data processing apparatus 101 d of FIG. 14 can provide advantages similar to those of the contents data processing apparatus 101 c of FIG. 12. In addition, since the payment condition in purchasing the charged contents data is changed depending on the character information, the fun in processing the contents data is further increased and users can enjoy even greater amusement.
  • Note that the present invention is not limited to the first and second embodiments described above, but can be modified in various ways.
  • For example, a part or the whole of the configuration of the contents data processing apparatuses described above in the first and second embodiments can be realized using a processor, such as a computer, which executes processing in accordance with a program. The program may be stored in a storage device, such as a hard disk or a semiconductor memory, or on a storage medium, such as a magnetic disk or a magneto-optical disk, and then read by the processor to execute the processing as the occasion requires. Alternatively, the program may be stored in a server capable of communicating via a wired or wireless communication means, and then downloaded to the processor to execute the processing as the occasion requires.
  • In the above-described second embodiment, the character information associated with the contents data may be only information indicating character properties, such as the nature and type of each character, or it may contain image information and voice information reproducible in the character information reproducing unit.
  • Stated otherwise, when image information and voice information are not contained in the character information, predetermined images and voices corresponding to the information indicating the character properties may be reproduced in the character information reproducing unit. When image information and voice information are contained in the character information, the associated image information and voice information may be reproduced in the character information reproducing unit.
  • While the above embodiments have been described in connection with the case in which the character information is stored beforehand, or the case in which the character information is inputted together with the contents data, the present invention is not limited to the above-described embodiments. Specific information for creating character information may be transmitted together with contents data, and the character information may be created using the specific information. Such a case will be described below in detail with reference to the drawings.
  • FIG. 16 is a schematic block diagram showing a configuration of a contents data processing apparatus according to a third embodiment of the present invention.
  • A contents data processing apparatus 200 shown in FIG. 16 comprises a processing unit 201, a recording unit 202, a reproducing unit 203, a user interface (I/F) unit 204, a character information creating unit 205, and a character memory 206.
  • The processing unit 201 executes a process of reproducing or recording contents data in response to a user's instruction entered from the user interface unit 204.
  • When the process of reproducing contents data is executed, the processing unit 201 reads contents data, which has been selected in response to the user's instruction, out of the contents data recorded in the recording unit 202, and then reproduces the read-out contents data in the reproducing unit 203. At this time, character information created corresponding to the read-out contents data in the character information creating unit 205 (described later in more detail) is also reproduced together with the read-out contents data.
  • For example, when the contents data and the character information are reproduced as images in the reproducing unit 203, the content and the character may be displayed on the same screen in a superimposed relation, or the character may be displayed to appear on the screen before or after reproduction of the content. When the reproducing unit 203 includes a plurality of displays, the content and the character may be displayed on respective different screens independently of each other.
  • When the process of recording contents data is executed, the processing unit 201 records character information, which has been created corresponding to the contents data, in the recording unit 202 in association with the contents data.
  • For example, the processing unit 201 may join the contents data and the character information into a single file and record the file in the recording unit 202, or may record the contents data and the character information in separate files.
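  • The two recording layouts can be sketched as follows, assuming the character information is a JSON-serializable dictionary; the length-prefixed header format is an invented example, not a format the apparatus specifies.

      # Hypothetical layouts for recording contents data with character information.
      import json

      def record_joined(path, contents_bytes, character):
          header = json.dumps(character).encode("utf-8")   # character info header
          with open(path, "wb") as f:
              f.write(len(header).to_bytes(4, "big"))      # header length prefix
              f.write(header)                              # character information
              f.write(contents_bytes)                      # then the contents data

      def record_separate(path, contents_bytes, character):
          with open(path, "wb") as f:
              f.write(contents_bytes)                      # contents data file
          with open(path + ".character.json", "w") as f:
              json.dump(character, f)                      # associated character file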
  • The recording unit 202 records the contents data and the character information, which are supplied from the processing unit 201, under write control of the processing unit 201, and outputs the recorded contents data to the processing unit 201 under read control of the processing unit 201.
  • The recording unit 202 may be constituted using, for example, a stationary storage device such as a hard disk, or a combination of a removable storage medium, such as a magneto-optical disk or a semiconductor memory card, and a reader/writer device.
  • The reproducing unit 203 reproduces the contents data and the character information under control of the processing unit 201.
  • The reproducing unit 203 includes, for example, a display for reproducing image information and a speaker for reproducing voice information, and reproduces images and voices corresponding to the contents data and the character information by those devices.
  • The user interface unit 204 includes input devices, such as a switch, a button, a mouse, a keyboard and a microphone, and transmits, to the processing unit 201, an instruction given from the user who performs a predetermined operation using those input devices. The user interface unit 204 may output the processing result of the processing unit 201 to the user using output devices such as a display, a lamp and a speaker.
  • The character information creating unit 205 creates character information depending on specific information associated with the contents data that is recorded or reproduced in the processing unit 201.
  • The character information created in the character information creating unit 205 contains information for reproducing images, voices, etc., of a virtual character corresponding to the content in the reproducing unit 203.
  • When the specific information associated with the contents data is information regarding the price of the relevant contents data, the character information may be created in the creating unit 205 depending on the price information.
  • For example, whether the price of the relevant contents data reaches a predetermined amount is determined based on the price information, and character information depending on the determination result is created in the creating unit 205. As another example, when the price of the relevant contents data exceeds a certain level, the character information may be created so as to dress the character in fancier clothing.
  • The character information may also be created in the creating unit 205 depending on information that is associated with the contents data and indicates the type of the relevant contents data.
  • For example, it is determined, in accordance with the type information, to which one of predetermined types the relevant contents data corresponds, and character information depending on the determination result is created in the creating unit 205.
  • To describe music content as an example, based on information of the music genre associated with the music content data (or other related information), it is determined to which one of predetermined genres (such as rock, jazz, Japanese popular songs, and classical music) the music content corresponds. Character information depending on the determination result is then created in the creating unit 205. For example, when it is determined that the music content is a Japanese popular song, character information such as dressing the character in a kimono (traditional Japanese clothing) is created in the creating unit 205.
  • The character information may also be created depending on information that is associated with the contents data and indicates the number of times the relevant contents data has been copied in the past.
  • For example, whether the number of times of copying of the relevant contents data reaches a predetermined value is determined in accordance with the information indicating the number of times of copying, and character information depending on the determination result is created in the creating unit 205. When the number of times of copying exceeds the predetermined value, character information presenting the character as twins may be created in the creating unit 205.
  • The character information may also be newly created depending on character information that is associated with the contents data and has been created before.
  • For example, the character information is not changed with regard to unchangeable attributes such as the price and the type, while it is changed depending on changeable attributes such as the number of times of copying or reproduction of the contents data. The creating unit 205 may create character information such that information indicating the number of times of copying or reproduction of the contents data contained in the character information is updated each time the contents data is copied or reproduced, and the image and voice of the character are changed when the number of times of copying or reproduction reaches a predetermined value.
  • The character information created in the creating unit 205 may contain ID information for identifying the owner of a character. By associating the ID information with the contents data, the following advantage is obtained. When fraudulently copied contents data, for example, is found, it is possible to specify the owner of the character, for which fraudulent copying was made, by checking the ID information contained in the relevant character information.
  • When another person's ID information is contained in the character information associated with the contents data, the character information for the user may be created in the creating unit 205 depending on the character information for the other person.
  • For example, it is determined whether the ID information contained in the character information associated with the contents data is identical to the user's own ID information. If not identical, character information for the user is newly created depending on information regarding the type, nature, state, etc. of the character contained in the character information associated with the contents data. When the character for the other person is a child, the character information may be created such that the character for the user is provided as a child of almost the same age.
  • When the character information containing another person's ID information includes a message issued from the character contained therein, character information may be created that causes the character for the user to reply to the message. For example, when the character for the other person issues a message “How many years old are you?”, the character information may be created such that the character for the user replies “I am five years old”.
  • The character information creating unit 205 has the functions described above.
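  • The creation rules described above can be condensed into a small sketch; the thresholds, the genre label, and the outfit names are assumptions made for illustration, not values the apparatus prescribes.

      # Hypothetical rules mapping specific information to a new character.
      def create_character(info, owner_id):
          character = {"owner": owner_id, "outfit": "plain", "twins": False}
          if info.get("price", 0) >= 1000:            # high-priced content:
              character["outfit"] = "formal"          # dress the character up
          if info.get("genre") == "japanese_pops":    # genre-dependent clothing
              character["outfit"] = "kimono"
          if info.get("copy_count", 0) >= 5:          # heavily copied content:
              character["twins"] = True               # present the character as twins
          return character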
  • The character memory 206 stores information necessary for a character to grow and change. The information stored in the character memory 206 is read and used, as required, when the character information is created in the character information creating unit 205. For example, the character memory 206 stores information necessary for reproducing character images and voices in the reproducing unit 203. In addition, the character memory 206 may also store the ID information for identifying the user of the contents data processing apparatus.
  • The character memory 206 may be constituted using, for example, a stationary storage device such as a hard disk, or a combination of a removable storage medium, such as a magneto-optical disk or a semiconductor memory card, and a reader.
  • The operation of the contents data processing apparatus 200 shown in FIG. 16 will be described below.
  • FIG. 17 is a flowchart for explaining one example of the contents data processing executed in the contents data processing apparatus 200 of FIG. 16.
  • Step ST401:
  • Contents data to be processed is inputted to the processing unit 201. In the example shown in FIG. 16, in response to a user's instruction entered from the user interface unit 204, the contents data instructed by the user to be processed is selected from among the contents data stored in the recording unit 202, and then read into the processing unit 201.
  • Step ST402:
  • Character information S5 is created depending on specific information S1 associated with the inputted contents data.
  • More specifically, image and voice information of a character (e.g., information regarding the face, clothing and voice of the character, and messages) is read out of the character memory 206 and processed depending on information regarding the price, the type, the number of times of copying, the number of times of reproduction, etc. of the contents data, whereby the character information is created in the creating unit 205.
  • When another person's ID information is contained in the character information, character information for the user is created in the creating unit 205 depending on the character information for the other person.
  • FIG. 18 is a flowchart for explaining one example of the more detailed process in the step ST402 of creating the character information.
  • According to the flowchart of FIG. 18, it is first determined whether the character information created before is associated with the inputted contents data (step ST4021). If it is determined in step ST4021 that the character information created before is not associated with the inputted contents data, new character information is created (step ST4022).
  • If it is determined in step ST4021 that character information created before is associated with the inputted contents data, it is then determined whether the user's own ID information is contained in that character information (step ST4023). If it is determined that another person's ID information is contained in the character information created before, character information for the user is newly created depending on information regarding the type, nature, state, etc. of the character contained in the character information for the other person (step ST4024).
  • If it is determined in step ST4023 that the user's own ID information is contained in the character information associated with the inputted contents data, the character information is updated as required (step ST4025). The updating process in step ST4025 is executed, for example, by updating the character information when the number of times of copying or reproduction of the contents data reaches a predetermined value.
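  • The branch of FIG. 18 reduces to the sketch below, which reuses the create_character sketch above; the companion-derivation and update rules are assumptions, and contents.character and contents.info are hypothetical fields standing in for the associated character information and specific information.

      # Hypothetical sketch of the FIG. 18 branch (steps ST4021-ST4025).
      def derive_companion(other, my_id):
          # Step ST4024: e.g. a pupil for a teacher, or a child of similar age.
          return {"owner": my_id, "type": "companion_of_" + other.get("type", "unknown")}

      def update_if_needed(character, contents):
          # Step ST4025: update when a changeable attribute reaches a threshold.
          if contents.info.get("copy_count", 0) >= 5:
              character["twins"] = True
          return character

      def obtain_character(contents, my_id):
          existing = contents.character                       # step ST4021
          if existing is None:                                # no prior character
              return create_character(contents.info, my_id)   # step ST4022
          if existing["owner"] != my_id:                      # step ST4023
              return derive_companion(existing, my_id)        # step ST4024
          return update_if_needed(existing, contents)         # step ST4025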
  • Step ST403:
  • This step ST403 executes a process of recording or reproducing the inputted contents data and the character information created corresponding to it.
  • More specifically, in the process of recording the contents data, the updated or newly created character information is recorded in the recording unit 202 in association with the contents data.
  • In the process of reproducing the contents data, the updated or newly created character information or the character information read out of the recording unit 202 in association with the contents data is reproduced together with the contents data.
  • With the contents data processing apparatus 200 shown in FIG. 16, as described above, since a game factor of bringing up a character associated with the contents data is added to the ordinary fun of reproducing the contents data, users can enjoy greater amusement in processing the contents data.
  • The brought-up character is not a character associated with the contents data processing apparatus, but a character that moves and grows with the contents data. Therefore, each time different contents data is reproduced, users are given the opportunity of enjoying characters in different grown-up states. As a result, users can be kept from becoming weary of bringing up characters more surely than in the case of bringing up only one character.
  • By containing the ID information of the character owner in the character information, it is possible to track down a user who has fraudulently copied the contents data.
  • When the character information for the user is newly created depending on the character information for another person, only the character for the user can be reproduced. By displaying the character for the user together with the character for the other person, however, the user can feel as if the two characters are communicating with each other.
  • For example, when the character for the other person is a teacher type, the character for the user is set to a pupil type correspondingly and displayed together with the teacher type character.
  • As another example, when the character for the other person issues a message “How much is it?”, the character for the user is set to issue a message “You may have it for nothing” correspondingly and displayed together with the character for the other person.
  • Thus, since a user can communicate with another user, who records or reproduces contents data, via a virtual character reproduced in the reproducing unit 203, the user can be given even greater amusement.
  • A configuration of a contents data processing apparatus according to a fourth embodiment of the present invention will be described below with reference to FIGS. 19 and 20.
  • In the fourth embodiment, character information is created depending on time information and/or position information.
  • FIG. 19 is a schematic block diagram showing the configuration of the contents data processing apparatus according to the fourth embodiment of the present invention.
  • A contents data processing apparatus 200 a shown in FIG. 19 comprises a processing unit 201, a recording unit 202, a reproducing unit 203, a user interface (I/F) unit 204, a character information creating unit 205 a, a character memory 206, a time information producing unit 207, and a position information producing unit 208. Note that the same components in FIG. 19 as those in FIG. 16 are denoted by the same symbols, and the above description should be referred to for details of those components.
  • The time information producing unit 207 produces time information, such as information regarding the time of day, information regarding time zones (forenoon, afternoon, etc.) per day, information regarding the day of week, information regarding the month, and information regarding the season.
  • The position information producing unit 208 produces information regarding the geographical position of the contents data processing apparatus 200 a. For example, the producing unit 208 may produce position information by utilizing a mechanism of the GPS (Global Positioning System) for measuring the geographical position in accordance with signals from satellites.
  • The character information creating unit 205 a has, in addition to the same function as that of the character information creating unit 205 shown in FIG. 16, the function of creating character information depending on the time information produced in the time information producing unit 207 and the position information produced in the position information producing unit 208.
  • For example, when the contents data is processed on a summer night, the creating unit 205 a creates character information dressing the character in a yukata (an informal kimono for summer wear).
  • When the contents data is processed in Hawaii, the creating unit 205 a creates the character information dressing a character in an aloha shirt.
  • In the case of reproducing the contents data, the form of a character created in the creating unit 205 a may be changed during the reproduction depending on the time information and/or the position information.
  • FIG. 20 is a flowchart for explaining one example of a process of reproducing contents data executed in the contents data processing apparatus 200 a of FIG. 19. More specifically, FIG. 20 shows one example of the detailed process in step ST403 in the flowchart of FIG. 17.
  • According to the process of reproducing contents data shown in the flowchart of FIG. 20, after the start of reproduction of the contents data (step ST4031), it is determined whether the time of day indicated by the time information produced by the time information producing unit 207 has reached a predetermined time of day (step ST4032). If it is determined in step ST4032 that the indicated time of day has reached the predetermined time of day, the character information under reproduction is changed depending on the predetermined time of day (step ST4033). For example, when the time of day at which the contents data is being reproduced reaches 12:00 midnight, the clothing of the character is changed to pajamas.
  • Then, it is determined whether the district in which the contents data processing apparatus 200 a is located, as indicated by the position information produced by the position information producing unit 208, has changed (step ST4034). If it is determined that the district where the contents data processing apparatus 200 a is located has changed, the character information is also changed depending on the district to which the contents data processing apparatus 200 a has moved (step ST4035). For example, when it is determined that the contents data processing apparatus 200 a, while reproducing contents data regarding professional baseball, has moved from one district to another, the mark of the baseball cap put on the character is changed to the mark representing the professional baseball team in the district to which the contents data processing apparatus 200 a has moved.
  • After the above-described determination regarding the time information (step ST4032) and the above-described determination regarding the district where the contents data processing apparatus 200 a is located (step ST4034), it is determined whether the reproduction of the contents data is to be completed, or whether the end of reproduction of the contents data has been instructed from the user interface unit 204 (step ST4036). If it is determined that the reproduction of the contents data is to be completed, or that the end of reproduction has been instructed, the process of reproducing the contents data is brought to an end (step ST4037). If it is determined in step ST4036 that the reproduction is neither to be completed nor instructed to end, i.e., that the reproduction of the contents data is to be continued, the process flow returns to step ST4032 to continue the process of reproducing the contents data.
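  • The reproduction loop of FIG. 20 can be sketched as follows; the player, clock, and gps objects and the pajamas/cap-mark rules are hypothetical stand-ins for the time information producing unit 207 and the position information producing unit 208.

      # Hypothetical sketch of the FIG. 20 reproduction loop (steps ST4031-ST4037).
      def reproduce_with_updates(player, character, clock, gps):
          player.start()                                   # step ST4031
          while True:
              if clock.now().hour == 0:                    # step ST4032: midnight
                  character["outfit"] = "pajamas"          # step ST4033
              district = gps.current_district()            # step ST4034
              if district != character.get("district"):
                  character["district"] = district         # step ST4035: e.g. swap
                  character["cap_mark"] = district         # the local team's mark
              if player.finished() or player.stop_requested():   # step ST4036
                  player.stop()                            # step ST4037
                  return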
  • Thus, the above-described contents data processing apparatus 200 a of FIG. 19 can provide advantages similar to those of the contents data processing apparatus 200 of FIG. 16. In addition, since the character information can be changed depending on the time and location at which contents data is processed, it is possible to provide a variety of variations in the patterns for bringing up a character and to give users even greater amusement.
  • A fifth embodiment of a contents data processing apparatus will be described below with reference to FIG. 21.
  • FIG. 21 is a schematic block diagram showing a configuration of the contents data processing apparatus according to the fifth embodiment of the present invention.
  • A contents data processing apparatus 200 b shown in FIG. 21 comprises a processing unit 201 a, a recording unit 202, a reproducing unit 203, a user interface (I/F) unit 204, a character information creating unit 205, a character memory 206, and a communication unit 209. Note that components in FIG. 21 common to those in FIG. 16 are denoted by the same symbols, and a detailed description of those components is omitted here.
  • The communication unit 209 executes a process of exchanging contents data and character information with other contents data processing apparatuses or a contents data supply server for supplying contents data to those contents data processing apparatuses. Any suitable communication method can be employed in the communication unit 209. For example, wired or wireless communication is usable as required. Alternatively, the communication unit 209 may communicate via a network such as the Internet.
  • The processing unit 201 a has, in addition to the same function as that of the processing unit 201 shown in FIG. 16, the function of selecting a desired one of the contents data recorded in the recording unit 202 in response to a user's instruction entered from the user interface unit 204, and then transmitting the selected contents data from the communication unit 209 to other contents data processing apparatuses or to a contents data supply server for supplying contents data to the other contents data processing apparatuses.
  • The communication unit 209 may also be controlled to execute a process of accessing the other contents data processing apparatuses or the contents data supply server and downloading the contents data held by them as contents data to be processed, and a process of supplying the contents data recorded in the recording unit 202 to the other contents data processing apparatuses or the contents data supply server in response to a download request from them.
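  • A minimal sketch of the exchange performed through the communication unit 209 is given below, assuming a hypothetical HTTP interface on the contents data supply server; the endpoint URL and the use of the requests library are illustrative only.

    import requests

    SERVER_URL = "http://example.com/contents"  # hypothetical supply server

    def download_contents(content_id):
        # Access the server (or another apparatus) and download the contents
        # data held there, to be recorded in the recording unit 202.
        response = requests.get(f"{SERVER_URL}/{content_id}")
        response.raise_for_status()
        return response.content

    def upload_contents(content_id, data):
        # Supply contents data recorded in the recording unit 202 in
        # response to a download request.
        response = requests.put(f"{SERVER_URL}/{content_id}", data=data)
        response.raise_for_status()
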
  • Thus, the above-described contents data processing apparatus 200 b of FIG. 21 can provide advantages similar to those of the contents data processing apparatus 200 of FIG. 16. In addition, the contents data processing apparatus 200 b can exchange or share contents data with the other contents data processing apparatuses or the contents data supply server. Since a user can thus bring up a character in cooperation with other users or exchange characters with other users, it is possible to give the user further increased amusingness.
  • Note that the present invention is not limited to the third to fifth embodiments described above, but can be modified in various ways.
  • For example, a part or the whole of the configuration of the contents data processing apparatuses described above in the third to fifth embodiments can be realized using a processor, such as a computer, which executes processing in accordance with a program. The program may be stored in a storage device, such as a hard disk or a semiconductor memory, or on a storage medium, such as a magnetic disk or a magneto-optical disk, and then read by the processor to execute the processing as the occasion requires. Alternatively, the program may be stored in a server capable of communicating via a wired or wireless communication means and then downloaded to the processor to execute the processing as the occasion requires.
  • The character information recorded in the recording unit 202 in association with the contents data may contain direct information used for reproducing a character (e.g., image information and voice information of the character), or may instead contain indirect information designating the reproduced form of a character (e.g., numbers made to correspond to respective patterns of a character). In the latter case, since the data amount of the character information is smaller than that of character information containing image information and voice information, the required recording capacity of the recording unit 202 can be held down.
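  • A minimal sketch contrasting the direct and indirect forms of character information is given below; the pattern table, field names, and record layout are hypothetical.

    # Hypothetical table resolving a pattern number to reproduction data.
    CHARACTER_PATTERNS = {
        1: {"image": "pattern1.png", "voice": "pattern1.wav"},
        2: {"image": "pattern2.png", "voice": "pattern2.wav"},
    }

    def resolve_character(record):
        # Direct form: the record itself carries the image and voice data.
        if "image" in record and "voice" in record:
            return record
        # Indirect form: the record stores only a pattern number, so the
        # character information kept per contents data stays small and the
        # required capacity of the recording unit 202 is held down.
        return CHARACTER_PATTERNS[record["pattern_no"]]
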

Claims (8)

1. A contents data processing apparatus comprising:
a reproducing block for reproducing contents data from a recording medium;
a character information generation block for generating character information based on first information accompanying said contents data; and
a character memory for storing second information regarding growth or change.
2. The contents data processing apparatus according to claim 1, wherein the first information includes at least one of a price of the contents data, a classification, a number of copies, or a number of reproductions.
3. The contents data processing apparatus according to claim 1, wherein the second information includes at least one of a character voice or a character image.
4. The contents data processing apparatus according to claim 1, further comprising a time information generator for generating time information, wherein the generated time information influences generation of the character information.
5. The contents data processing apparatus according to claim 1, further comprising a geographical position information generator for generating geographical position information, wherein the generated geographical position information affects generation of the character information.
6. The contents data processing apparatus according to claim 1, further comprising a communication line for exchanging the contents data or character information between the contents data processing apparatus and a contents supply server.
7. The contents data processing apparatus according to claim 6, wherein the communication line includes at least one of a wired or a wireless communication line.
8. The contents data processing apparatus according to claim 1, wherein the character memory includes at least one of a hard disk drive, a semiconductor memory, a floppy disk, or a magneto-optical disc.
US11/827,629 2002-02-20 2007-07-12 Contents data processing apparatus and method Abandoned US20070254737A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/827,629 US20070254737A1 (en) 2002-02-20 2007-07-12 Contents data processing apparatus and method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2002043660A JP2003242289A (en) 2002-02-20 2002-02-20 Device and method and program for contents processing
JPP2002-043660 2002-02-20
JPP2002-053502 2002-02-28
JP2002053502A JP4003482B2 (en) 2002-02-28 2002-02-28 Content processing apparatus, method thereof, and program
US10/364,495 US20030166414A1 (en) 2002-02-20 2003-02-11 Contents data processing apparatus and method
US11/827,629 US20070254737A1 (en) 2002-02-20 2007-07-12 Contents data processing apparatus and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/364,495 Continuation US20030166414A1 (en) 2002-02-20 2003-02-11 Contents data processing apparatus and method

Publications (1)

Publication Number Publication Date
US20070254737A1 true US20070254737A1 (en) 2007-11-01

Family

ID=27806908

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/364,495 Abandoned US20030166414A1 (en) 2002-02-20 2003-02-11 Contents data processing apparatus and method
US11/827,629 Abandoned US20070254737A1 (en) 2002-02-20 2007-07-12 Contents data processing apparatus and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/364,495 Abandoned US20030166414A1 (en) 2002-02-20 2003-02-11 Contents data processing apparatus and method

Country Status (1)

Country Link
US (2) US20030166414A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7534157B2 (en) 2003-12-31 2009-05-19 Ganz System and method for toy adoption and marketing
JP4385863B2 (en) * 2004-06-23 2009-12-16 株式会社セガ Online game fraud detection method
JP5000093B2 (en) * 2005-02-21 2012-08-15 ソニー株式会社 Data processing method, portable playback device and computer
JP5229484B2 (en) 2009-01-28 2013-07-03 任天堂株式会社 Information processing system, program, and information processing apparatus
JP5527721B2 (en) 2009-01-28 2014-06-25 任天堂株式会社 Program and information processing apparatus
JP5690473B2 (en) * 2009-01-28 2015-03-25 任天堂株式会社 Program and information processing apparatus
JP5813912B2 (en) * 2009-01-28 2015-11-17 任天堂株式会社 Program, information processing apparatus, and information processing system
US9186575B1 (en) * 2011-03-16 2015-11-17 Zynga Inc. Online game with animal-breeding mechanic
US10682575B1 (en) 2019-10-03 2020-06-16 Mythical, Inc. Systems and methods for generating in-game assets for a gaming platform based on inheriting characteristics from other in-game assets
US11389735B2 (en) * 2019-10-23 2022-07-19 Ganz Virtual pet system
US11358059B2 (en) 2020-05-27 2022-06-14 Ganz Live toy system
US11192034B1 (en) * 2020-07-08 2021-12-07 Mythical, Inc. Systems and methods for determining how much of a created character is inherited from other characters

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6200216B1 (en) * 1995-03-06 2001-03-13 Tyler Peppel Electronic trading card
US6152821A (en) * 1998-06-03 2000-11-28 Konami Co., Ltd. Video game machine, method for guiding designation of character position, and computer-readable recording medium on which game program implementing the same method is recorded
US6336865B1 (en) * 1999-07-23 2002-01-08 Fuji Photo Film Co., Ltd. Game scene reproducing machine and game scene reproducing system
US20020049087A1 (en) * 2000-09-07 2002-04-25 Teruyuki Ushiro Information processing apparatus, information processing method, and recording medium
US20020042921A1 (en) * 2000-10-11 2002-04-11 United Video Properties, Inc. Systems and methods for caching data in media-on-demand systems

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060002686A1 (en) * 2004-06-29 2006-01-05 Matsushita Electric Industrial Co., Ltd. Reproducing method, apparatus, and computer-readable recording medium
US20090199102A1 (en) * 2008-01-31 2009-08-06 Phm Associates Limited Communication method, apparatus and system for a retail organization
US9111302B2 (en) * 2008-01-31 2015-08-18 Phm Associates Limited Communication method, apparatus and system for a retail organization

Also Published As

Publication number Publication date
US20030166414A1 (en) 2003-09-04

Similar Documents

Publication Publication Date Title
US20070254737A1 (en) Contents data processing apparatus and method
US10176177B2 (en) Information processing apparatus and associated method of content exchange
US6349339B1 (en) System and method for utilizing data packets
US6488508B2 (en) Interactive communication system for communicating video game and karaoke software
US7294776B2 (en) Content supply method and apparatus
JP4081980B2 (en) Content providing service system, server device, and client device
US8959174B2 (en) System, apparatus, method and program for processing information
EP0938075B1 (en) Terminal apparatus, information service center, transmitting system, and transmitting method
CA2430062A1 (en) Distribution device, terminal device, and program and method for use therein
CN104168307B (en) Data distributing method, server, data distribution systems and terminal device
JP4003482B2 (en) Content processing apparatus, method thereof, and program
US7141733B2 (en) Karaoke apparatus, content reproducing apparatus, method of managing music piece data for a karaoke apparatus, and method of managing content data for content reproducing apparatus
JP4748096B2 (en) Content processing device
JPWO2005096308A1 (en) Content playback terminal
JP4663089B2 (en) User terminal, data distribution server, data purchase method, data distribution method, data distribution system, data reproduction apparatus, and data reproduction method
JPH1039880A (en) Karaoke system
JP2001216380A (en) Charging system for distributing contents information and its method
KR100923095B1 (en) Handy-Terminal and Storage-Media saving a packaged file of multimedia, System offering a packaged file of multimedia, Method of offering a multimedia and Method of playing a packaged file of multimedi
JP2002024184A (en) System and method for distributing contents information
JP2003255960A (en) Information distribution system
Ellis SCI Model 64 Sequencer (EMM Jun 1984)
JP2003132627A (en) Recording/reproducing device, recording method, reproducing device and distribution device
JP2010170580A (en) Data circulation system, server, and terminal device
JP2003333517A (en) Video contents recording apparatus, video contents recording method, video contents distribution recording system, video contents distribution apparatus, program, and medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKO, YOICHIRO;TORIYAMA, MITSURU;INOKUCHI, TATSUYA;AND OTHERS;REEL/FRAME:021662/0930;SIGNING DATES FROM 20030416 TO 20030423

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION