US20100318510A1 - Method for attaching tag to image of person - Google Patents
- Publication number
- US20100318510A1 (U.S. application Ser. No. 12/526,282)
- Authority
- US
- United States
- Prior art keywords
- candidates
- person
- candidate
- retrieved
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T9/00—Image coding
Definitions
- the present invention relates to a method for tagging an image of a person with ease.
- the tagging information on an image of a certain person may include not only a name of the certain person but also a nickname, a mail address of the certain person and the like.
- the tendency toward digital convergence provides various multimedia functions to a mobile phone or other portable devices, which may have small-sized keys so that it is difficult to input various texts by manipulating the keys.
- twelve keys may be used to input English, Korean, numbers, special characters and the like.
- even though the tagging information on an image may be variously determined in general, the tagging information on an image of a person may be restrictively determined.
- the tagging information e.g., the name, the nickname and other information on a certain person included in the image, attached to the image, may be used to classify and search the image with ease.
- GUI (Graphic User Interface)
- the GUI of a mobile phone or other portable devices capable of easily assisting a user to tag an image of a person.
- the image of the person can be easily classified and searched by using the tag attached thereto.
- FIG. 1 shows a flow chart of a method for tagging an image of a person in accordance with a first embodiment of the present invention
- FIG. 2 illustrates a part of a process included in the method in accordance with the first embodiment
- FIG. 3 provides the images of the N candidates in accordance with the first embodiment
- FIG. 4 depicts a flow chart showing a method for tagging an image of a person in accordance with a second embodiment of the present invention
- FIG. 5 provides a flow chart showing a method of tagging an image of a person in accordance with a third embodiment of the present invention
- FIG. 6 illustrates a part of the process included in the method in accordance with the third embodiment.
- FIG. 7 illustrates a part of the process included in the method in accordance with the third embodiment.
- a method for attaching tag information to an image of a person including the steps of: acquiring an image of a certain person; retrieving from a database a plurality of candidates having top N probabilities of being determined as the certain person and displaying the retrieved candidates on a screen of a terminal; providing a user with a pointing service capable of selecting a specific candidate among the displayed candidates; and attaching one or more tags to the image of the certain person by using tag information having been attached about the specific candidate who is selected by the user by the pointing service.
- a method for attaching tag information to an image of a person including the steps of: acquiring an image of a certain person; retrieving from a database a specific candidate having the highest probability of being determined as the certain person and displaying a name of the retrieved specific candidate near a facial region of the certain person on a screen of a terminal; retrieving from the database a plurality of next candidates having next highest probabilities of being determined as the certain person and displaying the retrieved candidates on the screen of the terminal; providing a user with a pointing service capable of selecting one among a candidate group including the specific candidate and the next candidates; and attaching one or more tags to the image of the certain person by using tag information having been attached about the selected one who is selected by the user by the pointing service.
- FIG. 1 shows a flow chart showing a method for tagging an image of a person in accordance with a first embodiment of the present invention.
- a device for tagging an image of a person acquires digital data including an image of a certain person in step S110.
- the digital data including the image of the certain person may be acquired by directly taking a picture of the certain person through a camera module built in the tagging device or by indirectly receiving it (or them) from other devices outside of the tagging device.
- the tagging device retrieves, from a database, a plurality of images of candidates having high probabilities of being determined as the certain person included in the acquired digital data and displays, e.g., the retrieved images of the candidates in step S120.
- a technique for providing a plurality of the candidates (i.e., a Top N list) having the top N probabilities of being determined as the certain person included in the acquired digital data is disclosed in Korean Patent Application No. 10-2006-0077416 filed on Aug. 17, 2006 (which was also filed as PCT international application No. PCT/KR2006/004494 on Oct. 31, 2006) by the same applicant as that of the present invention, entitled “Methods for Tagging Person Identification Information to Digital Data and Recommending Additional Tag by Using Decision Fusion”.
- the database where the images of the candidates have been recorded may be included in the tagging device, but it may be provided outside of the tagging device. In the latter case, the tagging device may receive the images of the candidates from the database to display them.
- After displaying the candidates, e.g., the images of the N candidates having the top N probabilities, the tagging device provides a user with a pointing service in step S130.
- the user may select a specific image among the images of the candidates by using the pointing service.
- the tagging device may attach one or more appropriate tags to the image of the certain person included in the acquired digital data by referring to one or more tags having been attached to the selected image, in step S140.
- the tags having been attached to the selected image include a name, a nickname, or other information
- the user may select the name or the nickname in order to attach one or more new tags to the image of the certain person included in the acquired digital data.
- the tagging device may attach other information, such as a mail address, a phone number and the like, to the image of the certain person, as additional tags, if selected by the user.
- FIG. 2 illustrates a part of a process included in the method in accordance with the first embodiment.
- the tagging device may provide a user with a GUI (graphical user interface) capable of selecting one person 230 among the plurality of persons to easily and selectively attach tagging information on the person 230 to the digital data 210.
- the person 230 is selected as shown in the picture in the right side of FIG. 2
- tagging information on the person 230 may be attached to the digital data 210 by using the convenient GUI provided by the tagging device.
- images (and/or names) of top N candidates having the top N probabilities of being determined as the person 230 may be displayed to embody the simple tagging process (Refer to FIG. 3 ).
- the tagging device may provide the user with the GUI capable of selecting a specified candidate among the displayed candidates in order to easily attach tag information on the specified candidate to the image.
- FIG. 3 provides the images of the N candidates in accordance with the first embodiment.
- the images (and/or the names) of nine candidates may be displayed in the form of 3*3 matrix on a screen.
- Displaying the images of the candidates in the form of 3*3 matrix enables the user to more easily select a specific candidate, in case an input unit of a mobile phone or a portable device is a sort of a keypad.
- if the keys corresponding to numerals 1 to 9 in the keypad of the mobile phone are arranged in the form of a 3*3 matrix, there is a one-to-one correspondence between the displayed images of the nine candidates and the keys of the keypad, so that the user can easily select any candidate by pressing an appropriate key.
- if the keys in the keypad are arranged in the form of an m*n matrix, the images of the candidates may also be displayed in the form of an m*n matrix in order to achieve a one-to-one correspondence therebetween.
- the user may press, e.g., a ‘0’ key to ignore it.
- the images of the candidates may be displayed along with their names. For example, an image of a first candidate is displayed along with a name of “Yumiko.”
- an image, e.g., the image of the first candidate, may be highlighted, and the image of the highlighted candidate may also be displayed in a separate region 311.
- the tagging device provides the user with the pointing service so that the user can select any one of the displayed images of the candidates. That is, the user can change the location of the highlighted region by manipulating the keys in order to select any one of the displayed images of the candidates. For example, if the user presses a ‘2’ key at the time when the image of the first candidate is highlighted, the location of the highlighted region may be moved to a second candidate (an image of the second candidate, i.e., “Kumi”, becomes highlighted). Referring to a screen 320 on which the image of the second candidate is highlighted, the image of the second candidate may be also displayed in a separate region 321 .
- the user can select one of the candidates by directly pressing the corresponding numerical key, or by moving the highlighted region by manipulating arrow keys provided to most mobile phones. For example, at the time when the image of the first candidate is highlighted, the user may move the highlighted region to the image of the second candidate by pressing the right arrow key.
- FIG. 4 depicts a flow chart showing a method for tagging an image of a person in accordance with a second embodiment of the present invention.
- a tagging device, e.g., a mobile phone or a portable device, acquires digital data including an image of a certain person in step S410.
- the digital data including the image of the certain person can be acquired by directly taking a picture of the certain person through a camera module built in the tagging device, or by indirectly receiving it (or them) from other devices outside of the tagging device.
- the tagging device may retrieve from a database a plurality of candidates, e.g., N candidates having the top N probabilities of being determined as the certain person included in the acquired digital data, and then display the retrieved images (and/or names) of the candidates in step S420.
- the database where the images of the candidates have been recorded may be included in the tagging device, but it may be provided outside of the tagging device. In the latter case, the tagging device may receive the images (and/or the names) of the candidates from the database in order to display them.
- After displaying the images (and/or the names) of the N candidates having the top N probabilities, the tagging device provides a user with a pointing service in step S430.
- the user can select a desired image among the images of the candidates by using the pointing service.
- the tagging device may display images (and/or names) of a second group of N candidates having the next highest probabilities, i.e., from top (N+1) to top 2N probabilities.
- the user may select the specific candidate by manipulating keys.
- the tagging device may display images (and/or names) of a third group of N candidates having the next highest probabilities, i.e., from top (2N+1) to top 3N probabilities.
- the user may select the specific candidate by manipulating keys.
- in the same manner, further groups of N candidates may be displayed for the user's choice.
- the user may press, e.g., a ‘List’ button to refer to the address book in step S440. If the desired person is considered to be included in the address book, the tagging device provides the user with the pointing service in step S450, so that the user can select the desired person.
- the user may determine the certain person included in the acquired digital data as a new person who has not been registered in the tagging device, and press a specific button, e.g., a ‘New’ button, to input the information on the new person (i.e., the certain person).
- the tagging device attaches tag information on the desired person to the acquired image of the certain person in step S460.
- for example, a name, a mail address, a phone number, etc. of the desired person may become the tag of the image of the certain person.
- the user may press a specific button, e.g., an ‘Ignore’ button, to delete it.
- FIG. 5 provides a flow chart showing a method for tagging an image of a person in accordance with a third embodiment of the present invention.
- a tagging device (for example, a mobile phone or a portable device) acquires digital data including an image of a certain person in step S510.
- the digital data including the image of the certain person can be acquired by directly taking a picture of the person through a camera module built in the tagging device, or by indirectly receiving it (or them) from other devices outside of the tagging device.
- the tagging device retrieves, from a database, a candidate having the highest probability of being determined as the certain person included in the acquired digital data and displays the name of the retrieved candidate near a facial image of the certain person in step S520 (refer to “Mayumi” in FIG. 6).
- if the name displayed near the facial image is selected in the end, the name may be conveniently attached as a tag to the image of the certain person.
- the tagging device retrieves, from the database, M candidates having next highest probabilities of being determined as the certain person included in the acquired image and displays the M candidates below the acquired image in step S530 (refer to Yumiko, Kumi, Sara and the like in a region 612 of FIG. 6). If the name displayed near the facial image of the certain person in step S520 is considered to be incorrect, the user may select a desired one from the displayed M candidates in step S530. In accordance with another embodiment of the present invention, Mayumi, who has the highest probability of being determined as the certain person, may be displayed together with Yumiko and Kumi, in the region 612.
- the database may be provided either inside or outside the tagging device.
- After displaying information, e.g., the name and/or the facial images, on the candidates in the region 612, the tagging device provides the user with a pointing service in step S540, so that the user can select a desired person among the candidates. In case the user selects the desired person by using the pointing service, the tagging device attaches tag information on the desired person to the acquired image of the certain person in step S550.
- FIG. 6 illustrates a part of the process included in the method in accordance with the third embodiment.
- the candidates having the high probabilities of being determined as each of the persons included in the acquired digital data are displayed.
- the images (and/or the names) of the six candidates may be displayed in the form of 2*3 matrix as shown in FIG. 6 .
- the images (and/or the names) of the candidates may be displayed in the form of p*q matrix in accordance with another embodiment of the present invention.
- the arrangement of the images (and/or the names) of the candidates may satisfy a one-to-one correspondence with that of the keys.
- the tagging device may provide the user with the pointing service so that the user can select a desired candidate among the candidates.
- the user may press the corresponding numerical key or move a highlighted region by manipulating arrow keys provided to the keypad.
- the user can press the arrow keys to display other candidates.
- an image (and/or a name) of another candidate may be provided from the bottom right side one by one, whenever the user presses, e.g., the right arrow key.
- an image (and/or a name) of the candidate of a high priority which has disappeared from the screen may appear on the screen one by one, whenever the user presses, e.g., the left arrow key.
- the functions of the left and the right arrow keys can be swapped.
- in FIG. 6, there is provided a specific image of one man and one woman.
- the description of the GUI has been focused on the tagging process about the woman whose facial area is highlighted, but the tagging process can also be applied to the man in the same manner if his facial area is highlighted.
- frames may be automatically set around the man's facial area and the woman's facial area. For example, if the user selects a right frame including the woman's face, to be tagged first by activating it by manipulating the keys, images and/or names of candidates, having high probabilities of being determined as the woman, are provided to help the user to easily attach one or more tags about the woman.
- the user may move a cursor to a left frame including the man's face to attach one or more tags about the man.
- the candidates having high probabilities of being determined as the man may be provided to the region 612 so that the user can easily select a desired candidate.
- the region 612 is changed into a region 622 as shown in the right side of FIG. 6 .
- the user can select the ‘New’ key to give a new name to a face 621 , as shown in the right side of FIG. 6 .
- FIG. 7 illustrates a part of the process included in the method in accordance with the third embodiment.
- the region 622 where the images (and/or the names) of the candidates are displayed disappears from the screen, and instead, a region 730 for inputting a new name may be displayed on a screen 700 .
- the user may insert the name of a person 710 by manually inputting the name in the region 730 .
- the user may insert the name of a person 720 by manually inputting the name in the region 730 .
Abstract
A method for attaching tag information to an image of a person, includes the steps of: acquiring an image of a certain person; retrieving from a database a plurality of candidates having top N probabilities of being determined as the certain person and displaying the retrieved candidates on a screen of a terminal; providing a user with a pointing service capable of selecting a specific candidate among the displayed candidates; and attaching one or more tags to the image of the certain person by using tag information having been attached about the specific candidate who is selected by the user by the pointing service. As a result, the GUI may be provided to a mobile phone, capable of assisting the user to tag the image of the certain person easily. Therefore, the image of the certain person can be easily classified and searched by using the tag attached thereto.
Description
- The present invention relates to a method for tagging an image of a person with ease.
- In recent years, much research has been conducted on image search methods, among which the search of an image of a person (a portrait) is of a great use and, for this search service, it is necessary to adequately tag an image of a person. For example, the tagging information on an image of a certain person may include not only a name of the certain person but also a nickname, a mail address of the certain person and the like.
- The tendency toward digital convergence provides various multimedia functions to a mobile phone or other portable devices, which may have small-sized keys so that it is difficult to input various texts by manipulating the keys. For example, in case of the mobile phone, twelve keys may be used to input English, Korean, numbers, special characters and the like.
- Even though the tagging information on an image may be variously determined in general, the tagging information on an image of a person may be restrictively determined. For example, the tagging information, e.g., the name, the nickname and other information on a certain person included in the image, attached to the image, may be used to classify and search the image with ease.
- In order to easily classify an image of a person, a technique for tagging the image of the person may be required. To this end, it is also necessary to develop a Graphic User Interface (GUI), capable of helping a user to tag the image of a person in comfort.
- It is, therefore, one object of the present invention to provide a user-friendly Graphic User Interface (GUI) capable of helping a user to tag an image of a person with ease.
- In accordance with exemplary embodiments of the present invention, there is provided the GUI of a mobile phone or other portable devices, capable of easily assisting a user to tag an image of a person. The image of the person can be easily classified and searched by using the tag attached thereto.
- The above and other objects and features of the present invention will become apparent from the following description of preferred embodiments given in conjunction with the accompanying drawings, in which:
- FIG. 1 shows a flow chart of a method for tagging an image of a person in accordance with a first embodiment of the present invention;
- FIG. 2 illustrates a part of a process included in the method in accordance with the first embodiment;
- FIG. 3 provides the images of the N candidates in accordance with the first embodiment;
- FIG. 4 depicts a flow chart showing a method for tagging an image of a person in accordance with a second embodiment of the present invention;
- FIG. 5 provides a flow chart showing a method of tagging an image of a person in accordance with a third embodiment of the present invention;
- FIG. 6 illustrates a part of the process included in the method in accordance with the third embodiment; and
- FIG. 7 illustrates a part of the process included in the method in accordance with the third embodiment.
- In accordance with one aspect of the present invention, there is provided a method for attaching tag information to an image of a person, including the steps of: acquiring an image of a certain person; retrieving from a database a plurality of candidates having top N probabilities of being determined as the certain person and displaying the retrieved candidates on a screen of a terminal; providing a user with a pointing service capable of selecting a specific candidate among the displayed candidates; and attaching one or more tags to the image of the certain person by using tag information having been attached about the specific candidate who is selected by the user by the pointing service.
- In accordance with another aspect of the present invention, there is provided a method for attaching tag information to an image of a person, including the steps of: acquiring an image of a certain person; retrieving from a database a specific candidate having the highest probability of being determined as the certain person and displaying a name of the retrieved specific candidate near a facial region of the certain person on a screen of a terminal; retrieving from the database a plurality of next candidates having next highest probabilities of being determined as the certain person and displaying the retrieved candidates on the screen of the terminal; providing a user with a pointing service capable of selecting one among a candidate group including the specific candidate and the next candidates; and attaching one or more tags to the image of the certain person by using tag information having been attached about the selected one who is selected by the user by the pointing service.
- In the following detailed description, reference is made to the accompanying drawings that show, by way of illustration, specific embodiments in which the present invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present invention. It is to be understood that the various embodiments of the present invention, although different from one another, are not necessarily mutually exclusive. For example, a particular feature, structure, or characteristic described herein in connection with one embodiment may be implemented within other embodiments without departing from the spirit and scope of the present invention. In addition, it is to be understood that the location or arrangement of individual elements within each disclosed embodiment may be modified without departing from the spirit and scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims, appropriately interpreted, along with the full range of equivalents to which the claims are entitled. In the drawings, like numerals refer to the same or similar functionality throughout the several views.
- The present invention will now be described in more detail, with reference to the accompanying drawings.
- FIG. 1 shows a flow chart showing a method for tagging an image of a person in accordance with a first embodiment of the present invention.
- A device for tagging an image of a person (hereinafter, referred to as ‘the tagging device’), e.g., a mobile phone or a portable device, acquires digital data including an image of a certain person in step S110. The digital data including the image of the certain person may be acquired by directly taking a picture of the certain person through a camera module built in the tagging device or by indirectly receiving it (or them) from other devices outside of the tagging device.
- If the image of the certain person is acquired, the tagging device retrieves, from a database, a plurality of images of candidates having high probabilities of being determined as the certain person included in the acquired digital data and displays, e.g., the retrieved images of the candidates in step S120. A technique for providing a plurality of the candidates (i.e., Top N list), having the top N probabilities of being determined as the certain person included in the acquired digital data, is disclosed in Korean Patent Application No. 10-2006-0077416 filed on Aug. 17, 2006 (which was also filed in PCT international application No. PCT/KR2006/004494 on Oct. 31, 2006) by the same applicant as that of the present invention, entitled “Methods for Tagging Person Identification Information to Digital Data and Recommending Additional Tag by Using Decision Fusion”.
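The retrieval step above can be sketched as a simple ranking over per-candidate match scores. This is an illustrative assumption only: the patent defers the actual recognition and scoring technique to Korean Patent Application No. 10-2006-0077416, and the `Candidate` record and score values below are hypothetical stand-ins.

```python
# Minimal sketch of the Top-N retrieval step (S120): rank enrolled candidates
# by their (assumed, precomputed) probability of being the queried person and
# return the N best, ready to be shown on the terminal screen.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    tags: dict = field(default_factory=dict)  # e.g. {"nickname": ..., "mail": ...}

def top_n_candidates(scores: dict, candidates: dict, n: int = 9) -> list:
    """Return the candidates with the top N probabilities, highest first."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [candidates[name] for name in ranked[:n]]

# Usage: with n=9, the result fills the 3*3 selection grid described later.
db = {name: Candidate(name) for name in ["Yumiko", "Kumi", "Sara", "Dave"]}
scores = {"Yumiko": 0.91, "Kumi": 0.74, "Sara": 0.40, "Dave": 0.12}
print([c.name for c in top_n_candidates(scores, db, n=3)])  # ['Yumiko', 'Kumi', 'Sara']
```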
- Herein, the database where the images of the candidates have been recorded may be included in the tagging device, but it may be provided outside of the tagging device. In the latter case, the tagging device may receive the images of the candidates from the database to display them.
- After displaying the candidates, e.g., the images of the N candidates having the top N probabilities, the tagging device provides a user with a pointing service in step S130. The user may select a specific image among the images of the candidates by using the pointing service.
- After the user selects the specific image, the tagging device may attach one or more appropriate tags to the image of the certain person included in the acquired digital data by referring to one or more tags having been attached to the selected image, in step S140. For example, if the tags having been attached to the selected image include a name, a nickname, or other information, the user may select the name or the nickname in order to attach one or more new tags to the image of the certain person included in the acquired digital data. Further, the tagging device may attach other information, such as a mail address, a phone number and the like, to the image of the certain person, as additional tags, if selected by the user.
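The tag-attachment step above amounts to copying user-chosen tag fields from the selected candidate's record onto the new image. The field names and dictionary representation below are illustrative assumptions, not part of the patent.

```python
# Sketch of step S140: attach tags to the acquired image by referring to the
# tags already attached to the selected candidate's image.
def attach_tags(image_tags: dict, selected: dict, chosen_fields: list) -> dict:
    """Copy the chosen tag fields (name, nickname, mail, ...) onto the image."""
    for f in chosen_fields:
        if f in selected:
            image_tags[f] = selected[f]
    return image_tags

photo = {}  # tags of the newly acquired image
yumiko = {"name": "Yumiko", "nickname": "Yumi", "mail": "yumiko@example.com"}
# The user selects the name, then opts to add the mail address as an extra tag.
attach_tags(photo, yumiko, ["name", "mail"])
print(photo)  # {'name': 'Yumiko', 'mail': 'yumiko@example.com'}
```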
- In case there are no images of some candidates stored in the database though they are included in the top N list, like Sara and Dave illustrated in FIG. 3, only their names may be displayed in the Top N list. They may be considered to have high probabilities of being determined as the certain person by referring to a life pattern, a text message, etc. thereof (see Korean Application No. 10-2006-0077416).
- FIG. 2 illustrates a part of a process included in the method in accordance with the first embodiment.
- If a plurality of persons are included in an acquired digital data 210, the tagging device may provide a user with a GUI capable of selecting one person 230 among the plurality of persons to easily and selectively attach tagging information on the person 230 to the digital data 210. If the person 230 is selected as shown in the picture in the right side of FIG. 2, tagging information on the person 230 may be attached to the digital data 210 by using the convenient GUI provided by the tagging device. In this case, images (and/or names) of top N candidates having the top N probabilities of being determined as the person 230 may be displayed to embody the simple tagging process (refer to FIG. 3). As described above, even if a plurality of persons are included in the image, the tagging device may provide the user with the GUI capable of selecting a specified candidate among the displayed candidates in order to easily attach tag information on the specified candidate to the image.
- FIG. 3 provides the images of the N candidates in accordance with the first embodiment.
- For example, as illustrated in FIG. 3, the images (and/or the names) of nine candidates may be displayed in the form of a 3*3 matrix on a screen. Displaying the images of the candidates in the form of a 3*3 matrix enables the user to more easily select a specific candidate, in case an input unit of a mobile phone or a portable device is a sort of a keypad. For example, if the keys corresponding to numerals 1 to 9 in the keypad of the mobile phone are arranged in the form of a 3*3 matrix, there is a one-to-one correspondence between the displayed images of the nine candidates and the keys of the keypad, so that the user can easily select any candidate by pressing an appropriate key.
- Further, if the keys in the keypad are arranged in the form of an m*n matrix, the images of the candidates may also be displayed in the form of an m*n matrix in order to achieve a one-to-one correspondence therebetween.
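The one-to-one keypad correspondence can be sketched as a direct key-to-cell lookup; the candidate list order and helper name below are illustrative assumptions.

```python
# Sketch of the keypad-to-grid mapping: candidates laid out row-major in an
# m*n matrix that mirrors the keypad, so pressing key k selects candidate k.
# The '0' key is reserved for "ignore" (no person, or a thing misrecognized
# as a person), as the surrounding text describes.
def candidate_for_key(key: str, candidates: list, rows: int = 3, cols: int = 3):
    """Map a numeric key ('1'..'9' on a 3*3 keypad) to the candidate shown
    in the matching grid cell; return None for '0' or an empty cell."""
    if key == "0":
        return None  # ignore this face region
    idx = int(key) - 1  # key '1' -> cell 0 (top-left), row-major order
    if 0 <= idx < min(rows * cols, len(candidates)):
        return candidates[idx]
    return None

grid = ["Yumiko", "Kumi", "Sara", "Dave", "Mayumi", "Ken", "Aya", "Jun", "Mia"]
print(candidate_for_key("2", grid))  # Kumi
print(candidate_for_key("0", grid))  # None
```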
- In the meantime, if no image of a person is included in the digital data, or if an image of a thing is incorrectly recognized as an image of a person, the user may press, e.g., a ‘0’ key to ignore it.
- As illustrated in
FIG. 3 , the images of the candidates may be displayed along with their names. For example, an image of a first candidate is displayed along with a name of “Yumiko.” - Moreover, as shown in a
screen 310 of the tagging device, an image, e.g., the image of the first candidate, may be highlighted. Further, the image of the highlighted candidate may also be displayed in a separate region 311. - The tagging device provides the user with the pointing service so that the user can select any one of the displayed images of the candidates. That is, the user can change the location of the highlighted region by manipulating the keys in order to select any one of the displayed images of the candidates. For example, if the user presses a ‘2’ key at the time when the image of the first candidate is highlighted, the location of the highlighted region may be moved to a second candidate (an image of the second candidate, i.e., “Kumi”, becomes highlighted). Referring to a
screen 320 on which the image of the second candidate is highlighted, the image of the second candidate may also be displayed in a separate region 321. - As described above, the user can select one of the candidates by directly pressing the corresponding numerical key, or by moving the highlighted region by manipulating the arrow keys provided on most mobile phones. For example, at the time when the image of the first candidate is highlighted, the user may move the highlighted region to the image of the second candidate by pressing the right arrow key.
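The highlight-moving pointing service described above can be sketched as follows, assuming the candidates are laid out row-major in a rows*cols grid; the grid size and index convention are illustrative assumptions.

```python
def move_highlight(index, key, rows=3, cols=3):
    """Return the new highlighted cell (row-major index) after an
    arrow-key press, clamped at the edges of the rows*cols grid."""
    r, c = divmod(index, cols)
    if key == "right":
        c = min(c + 1, cols - 1)
    elif key == "left":
        c = max(c - 1, 0)
    elif key == "down":
        r = min(r + 1, rows - 1)
    elif key == "up":
        r = max(r - 1, 0)
    return r * cols + c

# With the first candidate highlighted (index 0), the right arrow
# moves the highlight to the second candidate (index 1).
assert move_highlight(0, "right") == 1
```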
-
FIG. 4 depicts a flow chart showing a method for tagging an image of a person in accordance with a second embodiment of the present invention. - A tagging device, e.g., a mobile phone or a portable device, acquires digital data including an image of a certain person in step S410. As described in the method of tagging an image of a person as shown in
FIG. 1 , the digital data including the image of the certain person can be acquired by directly taking a picture of the certain person through a camera module built in the tagging device, or by indirectly receiving it (or them) from other devices outside of the tagging device. - If the digital data including the image of the certain person is acquired, the tagging device may retrieve from a database a plurality of candidates, e.g., N candidates having the top N probabilities of being determined as the certain person included in the acquired digital data and then displays the retrieved images (and/or names) of the candidates in step S420. The database where the images of the candidates have been recorded may be included in the tagging device, but it may be provided outside of the tagging device. In the latter case, the tagging device may receive the images (and/or the names) of the candidates from the database in order to display them.
- After displaying the images (and/or the names) of the N candidates having the top N probabilities, the tagging device provides a user with a pointing service in step S430. The user can select a desired image among the images of the candidates by using the pointing service.
- However, in case none of the displayed candidates is considered to be identical with the certain person, the tagging device may display images (and/or names) of a second group of N candidates having the next highest probabilities, i.e., from top (N+1) to top 2N probabilities.
- In case a specific candidate among the second group of the N candidates is considered to be the certain person, the user may select the specific candidate by manipulating keys. However, in case none of the displayed second group of the N candidates is considered to be identical with the certain person, the tagging device may display images (and/or names) of a third group of N candidates having the next highest probabilities, i.e., from top (2N+1) to top 3N probabilities.
- In case a specific candidate among the third group of the N candidates is considered to be the certain person, the user may select the specific candidate by manipulating keys.
- Otherwise, images (and/or names) of a fourth group of N candidates, a fifth group of N candidates, etc. may be displayed for the choice.
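The group-by-group display can be sketched as follows: group 0 holds the candidates with the top N probabilities, group 1 those ranked N+1 to 2N, and so on. The ranked list is an assumed input produced by the face matcher; the helper name is illustrative.

```python
def candidate_group(ranked, group, n):
    """Return the candidates ranked group*n+1 .. (group+1)*n from a
    probability-ranked list (an empty list once the ranking runs out)."""
    return ranked[group * n:(group + 1) * n]

# Twenty hypothetical candidates, best match first.
ranked = [f"person{i}" for i in range(1, 21)]
assert candidate_group(ranked, 0, 9)[0] == "person1"    # top-N group
assert candidate_group(ranked, 1, 9)[0] == "person10"   # ranks N+1..2N
assert candidate_group(ranked, 2, 9) == ["person19", "person20"]
```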
- However, in case none of the displayed candidates is considered to be the certain person even though all images of the candidates are retrieved, the user may search for the desired person among the entries registered in an address book, a phone book and the like. In detail, the user may press, e.g., a ‘List’ button to refer to the address book in step S440. If the desired person is considered to be included in the address book, the tagging device provides the user with the pointing service in step S450, so that the user can select the desired person. However, if there is no desired person in the address book, the user may determine that the certain person included in the acquired digital data is a new person who has not been registered in the tagging device, and press a specific button, e.g., a ‘New’ button, to input the information on the new person (i.e., the certain person). This manipulation of the keys may also be applied to other embodiments even when it is not specifically described.
- If the user selects the desired person, the tagging device attaches tag information on the desired person to the acquired image of the certain person in step S460. For example, a name, a mail address, a phone number, etc. of the desired person may become the tag of the image of the certain person.
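The selection fallback and tagging steps (S430 through S460) can be sketched as follows. This is a hedged sketch: `select` stands in for the pointing service (returning None when the user finds no match), `new_person` for the ‘New’ button entry, and the dictionary-based tag format is an assumption for illustration.

```python
def resolve_person(db_candidates, address_book, select, new_person):
    """Pick the tag source in the order described in the text."""
    choice = select(db_candidates)      # S430: pointing service over DB candidates
    if choice is None:
        choice = select(address_book)   # S440/S450: 'List' button, address book
    if choice is None:
        choice = new_person()           # 'New' button: register an unknown person
    return choice

def attach_tags(image_tags, person):
    """S460: attach the selected person's info (name, mail address,
    phone number, etc.) as tags on the acquired image."""
    for key in ("name", "mail", "phone"):
        if key in person:
            image_tags[key] = person[key]
    return image_tags

pick_first = lambda options: options[0] if options else None
assert resolve_person(["Mayumi"], [], pick_first, lambda: "new") == "Mayumi"
assert resolve_person([], [], pick_first, lambda: "new") == "new"
```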
- If the user has attached an incorrect tag to the image of the certain person or wants to delete a tag, the user may press a specific button, e.g., an ‘Ignore’ button, to delete it. This manipulation of the keys may also be applied to other embodiments even when it is not specifically described.
-
FIG. 5 provides a flow chart showing a method for tagging an image of a person in accordance with a third embodiment of the present invention. - A tagging device (for example, a mobile phone or a portable device) acquires digital data including an image of a certain person in step S510. As described in the embodiments of
FIGS. 1 and 4 , the digital data including the image of the certain person can be acquired by directly taking a picture of the person through a camera module built in the tagging device, or by indirectly receiving it (or them) from other devices outside of the tagging device. - If the digital data including the image of the certain person is acquired, the tagging device retrieves, from a database, the candidate having the highest probability of being determined as the certain person included in the acquired digital data and displays the name of the retrieved candidate near a facial image of the certain person in step S520 (refer to “Mayumi” in
FIG. 6 ). Herein, if the name displayed near the facial image is eventually selected, the name may be conveniently attached as a tag to the image of the certain person. - Moreover, the tagging device retrieves, from the database, M candidates having the next highest probabilities of being determined as the certain person included in the acquired image and displays the M candidates below the acquired image in step S530 (refer to Yumiko, Kumi, Sara and the like in a
region 612 of FIG. 6 ). If the name displayed near the facial image of the certain person in step S520 is considered to be incorrect, the user may select a desired one from the displayed M candidates in step S530. In accordance with another embodiment of the present invention, Mayumi, who has the highest probability of being determined as the certain person, may be displayed together with Yumiko and Kumi, in the region 612. - As described in the embodiments of
FIGS. 1 and 4 , the database may be provided either inside or outside the tagging device. - After displaying information, e.g., the name and/or the facial images, on the candidates in the
region 612, the tagging device provides the user with a pointing service in step S540, so that the user can select a desired person among the candidates. In case the user selects the desired person by using the pointing service, the tagging device attaches tag information on the desired person to the acquired image of the certain person in step S550. -
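Steps S520 and S530 amount to splitting a ranked candidate list: the best match is shown next to the face, while the next M matches fill the candidate region below the image. A minimal sketch, assuming the ranked list is the output of the face matcher and the names are illustrative:

```python
def split_candidates(ranked, m):
    """Return (best, next_m) from a probability-ranked candidate list:
    `best` is displayed near the facial image (S520), `next_m` in the
    region below the acquired image (S530)."""
    return ranked[0], ranked[1:1 + m]

best, others = split_candidates(["Mayumi", "Yumiko", "Kumi", "Sara", "Aki"], 3)
assert best == "Mayumi"                       # shown near the face
assert others == ["Yumiko", "Kumi", "Sara"]   # shown below the image
```

Per the alternative embodiment in the text, the best match may also simply be included again in the lower region (`ranked[:m]` instead of `ranked[1:1 + m]`).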
FIG. 6 illustrates a part of the process included in the method in accordance with the third embodiment. - Referring to a
screen 610 of FIG. 6 , the candidates having the high probabilities of being determined as each of the persons included in the acquired digital data are displayed. Herein, since there is not sufficient room for displaying nine candidates in the region 612 due to the space occupied by the acquired digital data, only six candidates can be displayed, unlike FIG. 3 . That is, the images (and/or the names) of the six candidates may be displayed in the form of 2*3 matrix as shown in FIG. 6 . - Further, the images (and/or the names) of the candidates may be displayed in the form of p*q matrix in accordance with another embodiment of the present invention. Herein, the arrangement of the images (and/or the names) of the candidates may satisfy a one-to-one correspondence with that of the keys.
- Likewise, the tagging device may provide the user with the pointing service so that the user can select a desired candidate among the candidates.
- To select the desired candidate among, e.g., the six candidates displayed in the
region 612, the user may press the corresponding numerical key or move a highlighted region by manipulating the arrow keys provided on the keypad. In case there is no desired candidate included in the displayed six candidates, the user can press the arrow keys to display other candidates. For example, an image (and/or a name) of another candidate may be provided from the bottom right side one by one, whenever the user presses, e.g., the right arrow key. Furthermore, an image (and/or a name) of a candidate of higher priority which has disappeared from the screen may reappear on the screen one by one, whenever the user presses, e.g., the left arrow key. Herein, it should be noted that the functions of the left and the right arrow keys can be swapped. - In detail, in
FIG. 6 , there is provided a specific image of one man and one woman. - Hereinbefore, the description of the GUI has been focused on the tagging process about the woman whose facial area is highlighted, but the tagging process can also be applied to the man in the same manner if his facial area is highlighted.
- Referring to
FIG. 6 , frames may be automatically set around the man's facial area and the woman's facial area. For example, if the user first selects and activates the right frame including the woman's face by manipulating the keys, images and/or names of candidates having high probabilities of being determined as the woman are provided to help the user easily attach one or more tags about the woman. - After completing the tagging process about the woman, the user may move a cursor to a left frame including the man's face to attach one or more tags about the man. Herein, if the cursor is moved to the left frame including the man's face, the candidates having high probabilities of being determined as the man may be provided to the
region 612 so that the user can easily select a desired candidate. - Meanwhile, if the user presses, e.g., the left arrow key twice when the candidates having the top N probabilities are displayed on the
screen 610 as shown in the left side of FIG. 6 , the region 612 is changed into a region 622 as shown in the right side of FIG. 6 . Herein, the user can select the ‘New’ key to give a new name to a face 621, as shown in the right side of FIG. 6 . -
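The arrow-key scrolling of the candidate region described above can be sketched as a sliding window over the ranked list. This is an assumed model: the window size of six matches the 2*3 region of FIG. 6, and the clamping behavior at either end is an illustrative choice.

```python
def scroll(start, key, total, visible=6):
    """Return the new index of the first visible candidate: the right
    arrow reveals the next-ranked candidate one by one, the left arrow
    brings back higher-priority candidates that scrolled off."""
    if key == "right":
        return min(start + 1, max(total - visible, 0))
    if key == "left":
        return max(start - 1, 0)
    return start

assert scroll(0, "right", total=9) == 1   # next candidate slides in
assert scroll(0, "left", total=9) == 0    # already at the top priority
assert scroll(3, "right", total=9) == 3   # last window, nothing more to show
```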
FIG. 7 illustrates a part of the process included in the method in accordance with the third embodiment. - Referring to
FIG. 7 , when the user presses the ‘New’ key, the region 622 where the images (and/or the names) of the candidates are displayed disappears from the screen, and instead, a region 730 for inputting a new name may be displayed on a screen 700. The user may insert the name of a person 710 by manually inputting the name in the region 730. Likewise, the user may insert the name of a person 720 by manually inputting the name in the region 730. - While the present invention has been shown and described with respect to the preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and the scope of the present invention as defined in the following claims.
Claims (23)
1. A method for attaching tag information to an electronic image of a person, comprising the steps of:
acquiring an electronic image of a person;
retrieving from a database a plurality of candidates having top N probabilities of being determined as the person and displaying the retrieved candidates on a screen of a user, where N is an integer equal to or larger than 1;
providing the user with a pointing service capable of selecting a candidate among the displayed candidates; and
upon receipt of a selection of the candidate from the user, attaching one or more tags to the electronic image of the person by using tag information attached about the selected candidate.
2. The method of claim 1 , wherein the step of acquiring the electronic image of the person comprises the step of specifying the person among a plurality of persons included in the electronic image.
3. The method of claim 1 , wherein the step of displaying the retrieved candidates comprises the step of displaying images or names of the retrieved candidates.
4. The method of claim 1 , wherein the step of displaying the retrieved candidates comprises the step of displaying the retrieved N candidates in the form of m*n matrix on the screen of the terminal such that the arrangement of the retrieved N candidates is one-to-one correspondence with the keys in the keypad in the terminal, if the keys of the keypad are arranged in the form of m*n matrix.
5. (canceled)
6. The method of claim 1 , wherein the step of providing the user with the pointing service comprises the step of: moving a position of a highlighted region including one candidate among the N candidates by manipulating keys, until the highlighted region includes the candidate.
7. The method of claim 1 , further comprising the steps of:
retrieving candidates from an address book and displaying the retrieved candidates by manipulating the key; and
providing the user with the pointing service capable of selecting the candidate among the retrieved candidates.
8. The method of claim 7 , wherein, in case there is no candidate in the database, the candidates are retrieved from the address book.
9. The method of claim 8 , wherein, in case there is no candidate in the address book, the person included in the acquired image is considered to be a new person who has not been registered in the database or the address book, and tag information for the certain person is manually inserted by manipulating keys.
10. The method of claim 1 , wherein the tags attached to the electronic image of the person include at least one of a name, a nickname, an address, and a telephone number of the person.
11. The method of claim 1 , wherein, in case the tags are incorrectly attached to the electronic image of the person, the tags are deleted by manipulating keys.
12. A method for attaching tag information to an electronic image of a person, comprising the steps of:
acquiring an electronic image;
retrieving from a database a candidate having the highest probability of being determined as a person and displaying a name of the retrieved candidate near a facial region of the person on a screen of a terminal;
retrieving from the database a plurality of next candidates having next highest probabilities of being determined as the person and displaying the retrieved candidates on the screen of the terminal;
providing a user with a pointing service capable of selecting one among a candidate group including the candidate and the next candidates; and
attaching one or more tags to the electronic image of the person by using tag information having been attached about the selected one who is selected by the user by the pointing service.
13. The method of claim 12 , wherein the step of displaying the retrieved next candidates displays candidates having the top N probabilities of being determined as the person except the candidate having the highest probability.
14. The method of claim 12 , wherein the step of displaying the retrieved next candidates displays candidates having the top N probabilities of being determined as the person including the candidate having the top 1 probability.
15. The method of claim 14 , wherein the step of displaying the retrieved next candidates displays candidates in the form of p*q matrix below the acquired image.
16. The method of claim 15 , wherein keys of a keypad in the terminal are arranged in the form of p*q matrix, and the arrangement of the displayed next candidates is one-to-one correspondence with that of the keys.
17. (canceled)
18. The method of claim 12 , further comprising the steps of:
retrieving candidates from an address book and displaying the retrieved candidates by manipulating the key; and
providing the user with the pointing service capable of selecting one among the retrieved candidates.
19. The method of claim 18 , wherein, in case there is no one selected in the database, the candidates are retrieved from the address book.
20. The method of claim 19 , wherein, in case there is no one selected in the address book, the person included in the acquired image is considered to be a new person who has not been registered in the database or the address book, and tag information for the certain person is manually inserted by manipulating keys.
21. The method of claim 12 , wherein the tags attached to the electronic image of the person include at least one of a name, a nickname, an address, and a telephone number of the person.
22. (canceled)
23. One or more computer-readable media having stored thereon a computer program that, when executed by one or more processors, causes the one or more processors to perform acts including:
acquiring an electronic image of a person;
retrieving from a database a plurality of candidates having top N probabilities of being determined as the person and displaying the retrieved candidates on a screen of a user, where N is an integer equal to or larger than 1;
providing the user with a pointing service capable of selecting a candidate among the displayed candidates; and
upon receipt of a selection of the candidate from the user, attaching one or more tags to the electronic image of the person by using tag information attached about the selected candidate.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2007-0013038 | 2007-02-08 | ||
KR1020070013038A KR100796044B1 (en) | 2007-02-08 | 2007-02-08 | Method for tagging a person image |
PCT/KR2008/000755 WO2008097049A1 (en) | 2007-02-08 | 2008-02-05 | Method for attaching tag to image of person |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100318510A1 true US20100318510A1 (en) | 2010-12-16 |
Family
ID=39218549
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/526,282 Abandoned US20100318510A1 (en) | 2007-02-08 | 2008-02-05 | Method for attaching tag to image of person |
Country Status (5)
Country | Link |
---|---|
US (1) | US20100318510A1 (en) |
EP (1) | EP2118849A4 (en) |
JP (1) | JP2010518505A (en) |
KR (1) | KR100796044B1 (en) |
WO (1) | WO2008097049A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090288028A1 (en) * | 2008-05-19 | 2009-11-19 | Canon Kabushiki Kaisha | Apparatus and method for managing content |
US20100054601A1 (en) * | 2008-08-28 | 2010-03-04 | Microsoft Corporation | Image Tagging User Interface |
WO2012149332A2 (en) * | 2011-04-29 | 2012-11-01 | Facebook, Inc. | Dynamic tagging recommendation |
US20130151609A1 (en) * | 2011-12-09 | 2013-06-13 | Yigal Dan Rubinstein | Content Report Management in a Social Networking System |
WO2014025185A1 (en) * | 2012-08-06 | 2014-02-13 | Samsung Electronics Co., Ltd. | Method and system for tagging information about image, apparatus and computer-readable recording medium thereof |
US8824748B2 (en) | 2010-09-24 | 2014-09-02 | Facebook, Inc. | Auto tagging in geo-social networking system |
US8856922B2 (en) | 2011-11-30 | 2014-10-07 | Facebook, Inc. | Imposter account report management in a social networking system |
US20150089396A1 (en) * | 2013-09-25 | 2015-03-26 | Kairos Social Solutions, Inc. | Device, System, and Method of Identifying a specific user from a profile image containing multiple people |
US9020183B2 (en) | 2008-08-28 | 2015-04-28 | Microsoft Technology Licensing, Llc | Tagging images with labels |
US20150227609A1 (en) * | 2014-02-13 | 2015-08-13 | Yahoo! Inc. | Automatic group formation and group detection through media recognition |
US20150278207A1 (en) * | 2014-03-31 | 2015-10-01 | Samsung Electronics Co., Ltd. | Electronic device and method for acquiring image data |
US9317530B2 (en) | 2011-03-29 | 2016-04-19 | Facebook, Inc. | Face recognition based on spatial and temporal proximity |
US9858298B1 (en) * | 2013-07-11 | 2018-01-02 | Facebook, Inc. | Methods and systems for using hints in media content tagging |
US11899730B2 (en) * | 2022-05-19 | 2024-02-13 | Sgs Ventures Inc. | System and method for managing relationships, organization, retrieval, and sharing of different types of contents accessible by a computing device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5526620B2 (en) * | 2009-06-25 | 2014-06-18 | 株式会社ニコン | Digital camera |
KR101671375B1 (en) * | 2009-12-18 | 2016-11-01 | 한국전자통신연구원 | Method For Searching Imageby user interface and electric device with the user interface |
US10382438B2 (en) | 2010-05-27 | 2019-08-13 | Nokia Technologies Oy | Method and apparatus for expanded content tag sharing |
CN102868818A (en) * | 2012-09-10 | 2013-01-09 | 韩洪波 | Mobile phone capable of presetting photo display character and time |
KR102084564B1 (en) * | 2013-05-15 | 2020-03-04 | 주식회사 엘지유플러스 | System and method for photo sharing by face recognition |
CN108197132B (en) * | 2017-10-09 | 2022-02-08 | 国网陕西省电力公司 | Graph database-based electric power asset portrait construction method and device |
JP7308421B2 (en) | 2018-07-02 | 2023-07-14 | パナソニックIpマネジメント株式会社 | LEARNING DEVICE, LEARNING SYSTEM AND LEARNING METHOD |
JP6810359B2 (en) * | 2018-11-22 | 2021-01-06 | キヤノンマーケティングジャパン株式会社 | Information processing device, control method, program |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5807256A (en) * | 1993-03-01 | 1998-09-15 | Kabushiki Kaisha Toshiba | Medical information processing system for supporting diagnosis |
US6362817B1 (en) * | 1998-05-18 | 2002-03-26 | In3D Corporation | System for creating and viewing 3D environments using symbolic descriptors |
US20030122839A1 (en) * | 2001-12-26 | 2003-07-03 | Eastman Kodak Company | Image format including affective information |
US20030151676A1 (en) * | 2001-12-28 | 2003-08-14 | Kazuyuki Seki | Input apparatus for image |
US20040008873A1 (en) * | 2002-05-24 | 2004-01-15 | Koji Sogo | Face collation apparatus and biometrics data collation apparatus |
US20040073543A1 (en) * | 2002-10-14 | 2004-04-15 | Samsung Electronics Co., Ltd. | Image retrieval method and apparatus using iterative matching |
US20040264780A1 (en) * | 2003-06-30 | 2004-12-30 | Lei Zhang | Face annotation for photo management |
US20050022114A1 (en) * | 2001-08-13 | 2005-01-27 | Xerox Corporation | Meta-document management system with personality identifiers |
US6956576B1 (en) * | 2000-05-16 | 2005-10-18 | Sun Microsystems, Inc. | Graphics system using sample masks for motion blur, depth of field, and transparency |
US20060239515A1 (en) * | 2005-04-21 | 2006-10-26 | Microsoft Corporation | Efficient propagation for face annotation |
US20060251339A1 (en) * | 2005-05-09 | 2006-11-09 | Gokturk Salih B | System and method for enabling the use of captured images through recognition |
US20070020678A1 (en) * | 2002-10-30 | 2007-01-25 | Dana Ault-Riche | Methods for producing polypeptide-tagged collections and capture systems containing the tagged polypeptides |
US20070239683A1 (en) * | 2006-04-07 | 2007-10-11 | Eastman Kodak Company | Identifying unique objects in multiple image collections |
US20080131073A1 (en) * | 2006-07-04 | 2008-06-05 | Sony Corporation | Information processing apparatus and method, and program |
US20090052862A1 (en) * | 2005-09-22 | 2009-02-26 | Jonathan El Bowes | Search tool |
US7587068B1 (en) * | 2004-01-22 | 2009-09-08 | Fotonation Vision Limited | Classification database for consumer digital images |
US20090254537A1 (en) * | 2005-12-22 | 2009-10-08 | Matsushita Electric Industrial Co., Ltd. | Image search apparatus and image search method |
US7634662B2 (en) * | 2002-11-21 | 2009-12-15 | Monroe David A | Method for incorporating facial recognition technology in a multimedia surveillance system |
US7715597B2 (en) * | 2004-12-29 | 2010-05-11 | Fotonation Ireland Limited | Method and component for image recognition |
US7978936B1 (en) * | 2006-01-26 | 2011-07-12 | Adobe Systems Incorporated | Indicating a correspondence between an image and an object |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AUPQ717700A0 (en) * | 2000-04-28 | 2000-05-18 | Canon Kabushiki Kaisha | A method of annotating an image |
KR100437447B1 (en) * | 2000-12-01 | 2004-06-25 | (주)아이펜텍 | A text tagging method and a recording medium |
JP2003281157A (en) * | 2002-03-19 | 2003-10-03 | Toshiba Corp | Person retrieval system, person tracing system, person retrieval method and person tracing method |
US7843495B2 (en) * | 2002-07-10 | 2010-11-30 | Hewlett-Packard Development Company, L.P. | Face recognition in a digital imaging system accessing a database of people |
JP2004086625A (en) | 2002-08-27 | 2004-03-18 | Hitoshi Hongo | Customer information managing device |
JP2004252883A (en) * | 2003-02-21 | 2004-09-09 | Canon Inc | Determination device |
JP4603778B2 (en) | 2003-06-20 | 2010-12-22 | キヤノン株式会社 | Image display method and image display apparatus |
JP2005149068A (en) * | 2003-11-14 | 2005-06-09 | Aruze Corp | System for confirming person involved |
US7822233B2 (en) * | 2003-11-14 | 2010-10-26 | Fujifilm Corporation | Method and apparatus for organizing digital media based on face recognition |
JP2005175597A (en) * | 2003-12-08 | 2005-06-30 | Nikon Corp | Electronic camera |
JP2006229289A (en) * | 2005-02-15 | 2006-08-31 | Konica Minolta Photo Imaging Inc | Imaging apparatus and data communication system |
WO2007011709A2 (en) * | 2005-07-18 | 2007-01-25 | Youfinder Intellectual Property Licensing Limited Liability Company | Manually-assisted automated indexing of images using facial recognition |
KR100641791B1 (en) * | 2006-02-14 | 2006-11-02 | (주)올라웍스 | Tagging Method and System for Digital Data |
-
2007
- 2007-02-08 KR KR1020070013038A patent/KR100796044B1/en not_active IP Right Cessation
-
2008
- 2008-02-05 EP EP08712406A patent/EP2118849A4/en not_active Ceased
- 2008-02-05 JP JP2009548999A patent/JP2010518505A/en active Pending
- 2008-02-05 US US12/526,282 patent/US20100318510A1/en not_active Abandoned
- 2008-02-05 WO PCT/KR2008/000755 patent/WO2008097049A1/en active Application Filing
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5807256A (en) * | 1993-03-01 | 1998-09-15 | Kabushiki Kaisha Toshiba | Medical information processing system for supporting diagnosis |
US6362817B1 (en) * | 1998-05-18 | 2002-03-26 | In3D Corporation | System for creating and viewing 3D environments using symbolic descriptors |
US6956576B1 (en) * | 2000-05-16 | 2005-10-18 | Sun Microsystems, Inc. | Graphics system using sample masks for motion blur, depth of field, and transparency |
US20050022114A1 (en) * | 2001-08-13 | 2005-01-27 | Xerox Corporation | Meta-document management system with personality identifiers |
US20030122839A1 (en) * | 2001-12-26 | 2003-07-03 | Eastman Kodak Company | Image format including affective information |
US20030151676A1 (en) * | 2001-12-28 | 2003-08-14 | Kazuyuki Seki | Input apparatus for image |
US20040008873A1 (en) * | 2002-05-24 | 2004-01-15 | Koji Sogo | Face collation apparatus and biometrics data collation apparatus |
US20040073543A1 (en) * | 2002-10-14 | 2004-04-15 | Samsung Electronics Co., Ltd. | Image retrieval method and apparatus using iterative matching |
US20070020678A1 (en) * | 2002-10-30 | 2007-01-25 | Dana Ault-Riche | Methods for producing polypeptide-tagged collections and capture systems containing the tagged polypeptides |
US7634662B2 (en) * | 2002-11-21 | 2009-12-15 | Monroe David A | Method for incorporating facial recognition technology in a multimedia surveillance system |
US20040264780A1 (en) * | 2003-06-30 | 2004-12-30 | Lei Zhang | Face annotation for photo management |
US7587068B1 (en) * | 2004-01-22 | 2009-09-08 | Fotonation Vision Limited | Classification database for consumer digital images |
US7715597B2 (en) * | 2004-12-29 | 2010-05-11 | Fotonation Ireland Limited | Method and component for image recognition |
US20060239515A1 (en) * | 2005-04-21 | 2006-10-26 | Microsoft Corporation | Efficient propagation for face annotation |
US20060251339A1 (en) * | 2005-05-09 | 2006-11-09 | Gokturk Salih B | System and method for enabling the use of captured images through recognition |
US7519200B2 (en) * | 2005-05-09 | 2009-04-14 | Like.Com | System and method for enabling the use of captured images through recognition |
US20090052862A1 (en) * | 2005-09-22 | 2009-02-26 | Jonathan El Bowes | Search tool |
US20090254537A1 (en) * | 2005-12-22 | 2009-10-08 | Matsushita Electric Industrial Co., Ltd. | Image search apparatus and image search method |
US7978936B1 (en) * | 2006-01-26 | 2011-07-12 | Adobe Systems Incorporated | Indicating a correspondence between an image and an object |
US20070239683A1 (en) * | 2006-04-07 | 2007-10-11 | Eastman Kodak Company | Identifying unique objects in multiple image collections |
US20080131073A1 (en) * | 2006-07-04 | 2008-06-05 | Sony Corporation | Information processing apparatus and method, and program |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090288028A1 (en) * | 2008-05-19 | 2009-11-19 | Canon Kabushiki Kaisha | Apparatus and method for managing content |
US8549421B2 (en) * | 2008-05-19 | 2013-10-01 | Canon Kabushiki Kaisha | Apparatus and method for managing content |
US20100054601A1 (en) * | 2008-08-28 | 2010-03-04 | Microsoft Corporation | Image Tagging User Interface |
US9020183B2 (en) | 2008-08-28 | 2015-04-28 | Microsoft Technology Licensing, Llc | Tagging images with labels |
US20150016691A1 (en) * | 2008-08-28 | 2015-01-15 | Microsoft Corporation | Image Tagging User Interface |
US8867779B2 (en) * | 2008-08-28 | 2014-10-21 | Microsoft Corporation | Image tagging user interface |
US8824748B2 (en) | 2010-09-24 | 2014-09-02 | Facebook, Inc. | Auto tagging in geo-social networking system |
US9317530B2 (en) | 2011-03-29 | 2016-04-19 | Facebook, Inc. | Face recognition based on spatial and temporal proximity |
US9264392B2 (en) | 2011-04-29 | 2016-02-16 | Facebook, Inc. | Dynamic tagging recommendation |
US8631084B2 (en) | 2011-04-29 | 2014-01-14 | Facebook, Inc. | Dynamic tagging recommendation |
WO2012149332A3 (en) * | 2011-04-29 | 2013-03-21 | Facebook, Inc. | Dynamic tagging recommendation |
WO2012149332A2 (en) * | 2011-04-29 | 2012-11-01 | Facebook, Inc. | Dynamic tagging recommendation |
US8856922B2 (en) | 2011-11-30 | 2014-10-07 | Facebook, Inc. | Imposter account report management in a social networking system |
US8849911B2 (en) * | 2011-12-09 | 2014-09-30 | Facebook, Inc. | Content report management in a social networking system |
US20140365382A1 (en) * | 2011-12-09 | 2014-12-11 | Facebook, Inc. | Content Report Management in a Social Networking System |
US20130151609A1 (en) * | 2011-12-09 | 2013-06-13 | Yigal Dan Rubinstein | Content Report Management in a Social Networking System |
US9524490B2 (en) * | 2011-12-09 | 2016-12-20 | Facebook, Inc. | Content report management in a social networking system |
WO2014025185A1 (en) * | 2012-08-06 | 2014-02-13 | Samsung Electronics Co., Ltd. | Method and system for tagging information about image, apparatus and computer-readable recording medium thereof |
US10191616B2 (en) | 2012-08-06 | 2019-01-29 | Samsung Electronics Co., Ltd. | Method and system for tagging information about image, apparatus and computer-readable recording medium thereof |
US10528591B2 (en) | 2013-07-11 | 2020-01-07 | Facebook, Inc. | Methods and systems for using hints in media content tagging |
US9858298B1 (en) * | 2013-07-11 | 2018-01-02 | Facebook, Inc. | Methods and systems for using hints in media content tagging |
US9727752B2 (en) * | 2013-09-25 | 2017-08-08 | Kairos Social Solutions, Inc. | Device, system, and method of identifying a specific user from a profile image containing multiple people |
US20150089396A1 (en) * | 2013-09-25 | 2015-03-26 | Kairos Social Solutions, Inc. | Device, System, and Method of Identifying a specific user from a profile image containing multiple people |
US10121060B2 (en) * | 2014-02-13 | 2018-11-06 | Oath Inc. | Automatic group formation and group detection through media recognition |
US20150227609A1 (en) * | 2014-02-13 | 2015-08-13 | Yahoo! Inc. | Automatic group formation and group detection through media recognition |
US20150278207A1 (en) * | 2014-03-31 | 2015-10-01 | Samsung Electronics Co., Ltd. | Electronic device and method for acquiring image data |
US11899730B2 (en) * | 2022-05-19 | 2024-02-13 | Sgs Ventures Inc. | System and method for managing relationships, organization, retrieval, and sharing of different types of contents accessible by a computing device |
Also Published As
Publication number | Publication date |
---|---|
JP2010518505A (en) | 2010-05-27 |
WO2008097049A1 (en) | 2008-08-14 |
EP2118849A4 (en) | 2011-03-23 |
KR100796044B1 (en) | 2008-01-21 |
EP2118849A1 (en) | 2009-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100318510A1 (en) | Method for attaching tag to image of person | |
US10157191B2 (en) | Metadata tagging system, image searching method and device, and method for tagging a gesture thereof | |
US6879846B1 (en) | Destination calling control system and destination calling control method | |
US8339451B2 (en) | Image navigation with multiple images | |
US8375283B2 (en) | System, device, method, and computer program product for annotating media files | |
CN101809533A (en) | Apparatus and method for tagging items | |
US20120151398A1 (en) | Image Tagging | |
EP2369819A1 (en) | Communication terminal apparatus and communication method | |
US20100322401A1 (en) | Methods for transmitting image of person, displaying image of caller and retrieving image of person, based on tag information | |
US8060839B2 (en) | Character input method and mobile communication terminal using the same | |
CN105718500A (en) | Text-based content management method and apparatus of electronic device | |
US8456491B2 (en) | System to highlight differences in thumbnail images, mobile phone including system, and method | |
EP2028588A2 (en) | Method and apparatus for forwarding media objects to a cellular telephone user | |
KR20050017316A (en) | An Apparatus And Method For Managing A Phonebook In A Mobile Terminal Having Camera | |
JP2011198070A (en) | Business card & memo information cooperation management device, business card & memo information cooperation management method, and program | |
CN101238703A (en) | Ringing image for incoming calls | |
US8509749B2 (en) | Mobile communication apparatus and operating method thereof | |
KR101315800B1 (en) | Management Method of Tag Based Personal Information in the Portable Information Terminal | |
JP5428911B2 (en) | Mobile terminal device, telephone directory search method, and telephone directory search program | |
CN112948422A (en) | Contact person searching method and device and electronic equipment | |
KR20050046450A (en) | Content management method for handheld terminal | |
JP2007128160A (en) | Information processor | |
KR100673448B1 (en) | Mobile communication terminal searching memo and its operating method | |
JP2005346245A (en) | Manual information display system and information apparatus | |
JP2000305702A (en) | Character input system for electronics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: OLAWORKS, INC., KOREA, REPUBLIC OF; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: RYU, JUNG-HEE; REEL/FRAME: 023066/0782; Effective date: 20090805 |
 | AS | Assignment | Owner name: INTEL CORPORATION, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: OLAWORKS; REEL/FRAME: 028824/0075; Effective date: 20120615 |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |