US20110069183A1 - Methods and apparatuses for identifying opportunities to capture content - Google Patents
- Publication number
- US20110069183A1 (U.S. application Ser. No. 12/958,189)
- Authority
- US
- United States
- Prior art keywords
- user
- electronic device
- capture
- image
- suggestion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72457—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00204—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a digital computer or a digital computer system, e.g. an internet server
- H04N1/00244—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a digital computer or a digital computer system, e.g. an internet server with a server, e.g. an internet server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00326—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/18—Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals
- H04W4/185—Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals by embedding added-value information into content, e.g. geo-tagging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3204—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
- H04N2201/3205—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of identification information, e.g. name or ID code
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3204—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium
- H04N2201/3207—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a user, sender, addressee, machine or electronic recording medium of an address
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3233—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of authentication information, e.g. digital signature, watermark
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 10/771,902, filed on Feb. 4, 2004, incorporated herein by reference in its entirety, now U.S. Pat. No. ______.
- The present invention relates generally to identifying opportunities to capture content and, more particularly, to identifying opportunities to capture content based on location.
- With the use of electronic image capturing devices, there has been a proliferation of images that have been recorded by users. Users typically record content that captures their interests. In one example, a user utilizes the image capturing device to record images of historic buildings while on vacation. The content captured by users includes video tracks, graphic images, and photographs.
- In some cases while sightseeing on vacation, the user may not be aware of an opportunity to record sights nearby that interest the user. For example, the user may be visiting a historic building within a city. While visiting, the user is able to capture images of this historic building. Because the user is not familiar with this city, the user may not realize that another historic building is located several blocks away. Had the user known that a similar historic building was only several blocks away, the user would have visited this additional historic building as well.
- In one embodiment, the methods and apparatuses for identifying opportunities to capture content sense a user profile; determine a geographic location of a device; and transmit at least one suggestion to the device based on the geographic location of the device and the user profile, wherein the suggestion indicates a photo opportunity.
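The claimed flow above (sense a profile, determine a location, transmit a matching suggestion) can be sketched in a few lines. The following Python is an illustrative sketch only, not an implementation from the patent; the opportunity records, the 2 km radius, and the function names `haversine_km` and `suggest` are all invented for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def suggest(opportunities, profile_interests, device_lat, device_lon, radius_km=2.0):
    """Return nearby opportunities whose content types overlap the user profile,
    closest first -- one plausible reading of 'suggestion based on the geographic
    location of the device and the user profile'."""
    matches = []
    for opp in opportunities:
        if not (set(opp["types"]) & set(profile_interests)):
            continue  # content type does not match the profile
        d = haversine_km(device_lat, device_lon, opp["lat"], opp["lon"])
        if d <= radius_km:
            matches.append((d, opp["name"]))
    matches.sort()
    return [name for _, name in matches]
```

A user whose profile lists "historic building" and who is standing a few hundred meters from a listed historic site would receive that site as a suggestion, while sites outside the radius or outside the profile are filtered out.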
- The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate and explain one embodiment of the methods and apparatuses for identifying opportunities to capture content. In the drawings,
-
FIG. 1 is a diagram illustrating an environment within which the methods and apparatuses for identifying opportunities to capture content are implemented; -
FIG. 2 is a simplified block diagram illustrating one embodiment in which the methods and apparatuses for identifying opportunities to capture content are implemented; -
FIG. 3 is a simplified block diagram illustrating a system, consistent with one embodiment of the methods and apparatuses for identifying opportunities to capture content; -
FIG. 4 is a simplified block diagram illustrating one embodiment in which the methods and apparatuses for identifying opportunities to capture content are implemented; -
FIG. 5 is a flow diagram consistent with one embodiment of the methods and apparatuses for identifying opportunities to capture content; -
FIG. 6 is a flow diagram consistent with one embodiment of the methods and apparatuses for identifying opportunities to capture content; and -
FIG. 7 is a flow diagram consistent with one embodiment of the methods and apparatuses for identifying opportunities to capture content. - The following detailed description of the methods and apparatuses for identifying opportunities to capture content refers to the accompanying drawings. The detailed description is not intended to limit the methods and apparatuses for identifying opportunities to capture content. Instead, the scope of the methods and apparatuses for identifying opportunities to capture content is defined by the appended claims and equivalents. Those skilled in the art will recognize that many other implementations are possible, consistent with the present invention.
- References to “content” includes data such as photographs, images, video, text, graphics, and the like, that are embodied in digital or analog electronic form.
- References to “electronic device” includes a device such as a digital still camera, a video camera, a personal digital assistant with an ability to capture an image, a cellular device with an ability to capture an image, and any electronic device with an ability to capture an image.
-
FIG. 1 is a diagram illustrating an environment within which the methods and apparatuses for identifying opportunities to capture content are implemented. The environment includes an electronic device 110 (e.g., a computing platform configured to act as a client device, such as a computer, a personal digital assistant, a cellular telephone, a camera device), a user interface 115, a network 120 (e.g., a local area network, a home network, the Internet), and a server 130 (e.g., a computing platform configured to act as a server). - In one embodiment, one or
more user interface 115 components are made integral with the electronic device 110 (e.g., keypad and video display screen input and output interfaces in the same housing as personal digital assistant electronics (e.g., as in a Clie® manufactured by Sony Corporation)). In other embodiments, one or more user interface 115 components (e.g., a keyboard, a pointing device (mouse, trackball, etc.), a microphone, a speaker, a display, a camera) are physically separate from, and are conventionally coupled to, electronic device 110. The user utilizes interface 115 to access and control content and applications stored in electronic device 110, server 130, or a remote storage device (not shown) coupled via network 120. - In accordance with the invention, embodiments of identifying opportunities to capture content as described below are executed by an electronic processor in
electronic device 110, in server 130, or by processors in electronic device 110 and in server 130 acting together. Server 130 is illustrated in FIG. 1 as being a single computing platform, but in other instances is two or more interconnected computing platforms that act as a server. - The methods and apparatuses for identifying opportunities to capture content are shown in the context of exemplary embodiments of applications in which content is suggested and identified based on the location of the electronic device. In one embodiment, the user utilizes content through the
electronic device 110 and the network 120. In another embodiment, the content is tracked and synchronized by the application that is located within the server 130 and/or the electronic device 110. - In one embodiment, the methods and apparatuses for identifying opportunities to capture content are configured to provide a device with a list of suggestions based on a user profile and a location of the device. In one instance, the list of suggestions is selected from a photo opportunity database. Additionally, in one embodiment the user profile includes the types of content of interest to the user of the device.
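A photo opportunity database of the kind just described might plausibly be indexed both by geographic area and by content type, so that one entry appears under several labels at once. The Python below is a hedged sketch of that shape; the class name, method names, and entries are invented for illustration and are not from the patent.

```python
from collections import defaultdict

class OpportunityDatabase:
    """Toy photo opportunity database indexed by geographic area and content type."""

    def __init__(self):
        self.by_area = defaultdict(set)  # area name -> opportunity names
        self.by_type = defaultdict(set)  # content type -> opportunity names

    def add(self, name, area, content_types):
        """Register one opportunity under its area and under each of its types."""
        self.by_area[area].add(name)
        for t in content_types:
            self.by_type[t].add(name)

    def lookup(self, area, content_types):
        """Opportunities in `area` classified under any of `content_types`."""
        typed = set()
        for t in content_types:
            typed |= self.by_type[t]
        return sorted(self.by_area[area] & typed)
```

A single church could thus be registered under "church", "historical building", and "monument" for content type, and under its city for geographic area, and would be found by a lookup along any of those labels.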
- In one embodiment, the methods and apparatuses for identifying opportunities to capture content are configured to provide detailed information corresponding to a selection from the list of suggestions. In one instance, the detailed information includes directions to the selection. In another instance, the detailed information includes detailed, dynamic directions to the selection based on the real-time location of the device. In one embodiment, the detailed information includes sample images and text describing the selection.
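One way to read "dynamic directions based on the real-time location of the device" is that the heading toward the selection is recomputed whenever the device reports a new position. The sketch below, with invented names and a deliberately crude compass-point reduction, illustrates that idea; it is not the patent's method.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 toward point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def dynamic_directions(device_pos, selection):
    """One refresh of the detailed information for a selected suggestion:
    heading from the current device position, plus any stored media and text."""
    b = bearing_deg(device_pos[0], device_pos[1], selection["lat"], selection["lon"])
    compass = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"][int((b + 22.5) % 360 // 45)]
    return {"name": selection["name"], "heading": compass,
            "sample_images": selection.get("sample_images", []),
            "description": selection.get("description", "")}
```

Calling `dynamic_directions` again after the device moves yields an updated heading, which is the "dynamic" part of the behavior described above.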
- In one embodiment, the methods and apparatuses for identifying opportunities to capture content automatically update the user profile based on the image(s) captured by the device.
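A minimal sketch of such an automatic profile update, assuming each captured image carries one or more content-type labels: the profile is simply the running tally of those labels. Function names and the tally representation are invented for the example.

```python
from collections import Counter

def update_profile(profile, captured_image_labels):
    """Fold the content-type labels of newly captured images into the profile tally."""
    profile.update(captured_image_labels)
    return profile

def top_interests(profile, n=3):
    """The content types the user captures most often, best first."""
    return [label for label, _ in profile.most_common(n)]
```

Under this sketch, a user who repeatedly photographs churches would see "church" rise to the top of the profile without ever editing it by hand.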
-
FIG. 2 is a simplified diagram illustrating an exemplary architecture in which the methods and apparatuses for identifying opportunities to capture content are implemented. The exemplary architecture includes a plurality of electronic devices 110, a server device 130, and a network 120 connecting electronic devices 110 to server 130 and each electronic device 110 to each other. The plurality of electronic devices 110 are each configured to include a computer-readable medium 209, such as random access memory, coupled to an electronic processor 208. Processor 208 executes program instructions stored in the computer-readable medium 209. A unique user operates each electronic device 110 via an interface 115 as described with reference to FIG. 1. -
Server device 130 includes a processor 211 coupled to a computer-readable medium 212. In one embodiment, the server device 130 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as database 240. - In one instance,
processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, Calif. In other instances, other microprocessors are used. - The plurality of
client devices 110 and the server 130 include instructions for a customized application for identifying opportunities to capture content. In one embodiment, the plurality of computer-readable media 209 and 212 contain, in part, the customized application. Additionally, the plurality of client devices 110 and the server 130 are configured to receive and transmit electronic messages for use with the customized application. Similarly, the network 120 is configured to transmit electronic messages for use with the customized application. - One or more user applications are stored in
memories 209, in memory 211, or a single user application is stored in part in one memory 209 and in part in memory 211. In one instance, a stored user application, regardless of storage location, is made customizable based on identifying opportunities to capture content as determined using embodiments described below. -
FIG. 3 illustrates one embodiment of an identifying system 300. The identifying system 300 includes a review module 310, a location module 320, a storage module 330, an interface module 340, and a control module 350. In one embodiment, the control module 350 communicates with the review module 310, the location module 320, the storage module 330, and the interface module 340. - In one embodiment, the
control module 350 coordinates tasks, requests, and communications between the review module 310, the location module 320, the storage module 330, and the interface module 340. - In one embodiment, the
review module 310 analyzes a profile associated with a user of the electronic device 110. In one embodiment, the profile includes different content types of pictures captured by the user. For example, the user of the electronic device 110 captures pictures of monuments, scenic landscapes, churches, historical buildings, and the like. - In another embodiment, the
review module 310 identifies possible picture opportunities by selecting from a picture opportunity database based in part on the profile. In one embodiment, the picture opportunity database includes a listing of opportunities to capture a photograph and is organized by geographic area and content type. For example, a church located in San Francisco, Calif. is listed as a photograph opportunity within the picture opportunity database. In this example, this listed church can be classified as a church, a historical building, a monument, and the like for content type within the picture opportunity database. In addition, this listed church can also be classified under "San Francisco" under geographic area within the picture opportunity database. - In yet another embodiment, the
review module 310 creates a profile for the user based on the photographs captured by the user. For example, the review module 310 reviews prior photographs captured by the user through the electronic device 110. - In yet another embodiment, the
review module 310 updates an existing profile based on additional photographs captured by the user. For example, the review module 310 monitors the photographic activity of the user through the electronic device 110. - In one embodiment, the
location module 320 identifies a location of the electronic device 110. In one embodiment, the location module 320 receives information identifying the location of the electronic device 110 and makes the location information available to the review module 310. In one embodiment, the location of the electronic device is determined through a system of satellites such as a global positioning system (GPS). In another embodiment, the location of the electronic device 110 is determined locally by multiple sensors which pinpoint the device's location. - In yet another embodiment, the location of the
electronic device 110 is determined by sensing the electronic device 110 within a general area. For example, the electronic device 110 is sensed entering an entrance gate at an amusement park. In this embodiment, the location module 320 senses that the electronic device 110 is within the amusement park. Although the exact location of the electronic device 110 is not available, the location module 320 ascertains the general area of the electronic device 110. - In one embodiment, the
storage module 330 stores a profile associated with a user of the electronic device 110. In another embodiment, the storage module 330 also stores the picture opportunity database. - In one embodiment, the
interface module 340 detects an action such as capturing a photograph through the electronic device 110. - In another embodiment, the
interface module 340 transmits different possible selection choices from the picture opportunity database. - In yet another embodiment, the
interface module 340 interacts with the user regarding the user's selection from the picture opportunity database. - In another embodiment, the
interface module 340 interacts with other devices. For example, in one instance, the interface module 340 interacts with a GPS system for receiving geographical information regarding the electronic device 110. - The identifying
system 300 in FIG. 3 is shown for exemplary purposes and is merely one embodiment of the methods and apparatuses for identifying opportunities to capture content. Additional modules may be added to the system 300 without departing from the scope of the methods and apparatuses for identifying opportunities to capture content. Similarly, modules may be combined or deleted without departing from the scope of the methods and apparatuses for identifying opportunities to capture content. -
FIG. 4 illustrates a simplified overview system 400 for identifying opportunities to capture images. The system 400 includes an electronic device 110, a server 130, a locator system 410, and a wireless network 420. The system 400 is associated with specific content. - In one embodiment, the
electronic device 110 captures images under the direction of a user. In one instance, theelectronic device 110 receives suggestions for opportunities to capture images based on the location of theelectronic device 110 and the unique profile of the user. In another instance, theelectronic device 110 provides detailed instructions about a particular opportunity of interest based on the selection of the user from the multiple suggestions. - In one embodiment, the
server 130 monitors theelectronic device 110 and provides theelectronic device 110 with suggestions for opportunities to capture images. In one instance, theserver 130 maintains the unique profile for each user and the picture opportunity database. In another instance, theserver 130 detects activity of theelectronic device 110 and updates the profile for the user. In another instance, theserver 130 monitors the location of theelectronic device 110. - In one embodiment, the
locator system 410 determines the geographical location of theelectronic device 110. In one instance, thelocator system 410 is a GPS system that utilizes multiple satellites to track the location of the electronic device throughout the world. In another instance, thelocator system 410 comprises multiple local sensors to track the location of theelectronic device 110 within a localized area. - In one embodiment, the
locator system 410 communicates with theserver 130 through the electronic device. In another embodiment, thelocator system 410 communicates directly with theelectronic device 110. - In one embodiment, the
wireless network 420 facilitates communication between the electronic device 110 and the server 130. In one embodiment, the wireless network 420 is a cellular network. In another embodiment, the wireless network 420 is a wide area network. - The flow diagrams as depicted in
FIGS. 5, 6, and 7 are one embodiment of the methods and apparatuses for identifying opportunities to capture content. The blocks within the flow diagrams can be performed in a different sequence without departing from the spirit of the methods and apparatuses for identifying opportunities to capture content. Further, blocks can be deleted, added, or combined without departing from that spirit. - The flow diagram in
FIG. 5 illustrates identifying opportunities to capture content according to one embodiment of the invention. In Block 510, a camera device is detected. In one embodiment, the server 130 detects the camera device. - In
Block 520, a user's identity is determined. In one embodiment, the user interfaces with the camera device detected within the Block 510. The user may be identified in numerous ways. In one instance, the user is identified through a password. In another instance, the user is identified through a biometric parameter such as a fingerprint, a DNA sample, an iris scan, and the like. In yet another instance, the user is identified through the use of a specific camera device. - In
Block 530, a unique profile corresponding to the user's identity is loaded within the system 300. In one embodiment, the unique profile is loaded among a plurality of profiles associated with other users. In one embodiment, the profile indicates the content types that the user has expressed interest in. For example, the profile reflects that the user shows interest in content such as historic buildings, hot rod automobiles, and the like. In one instance, the content types range from general categories such as "buildings" to more refined subsets such as "interiors of cathedrals" being a subset of "buildings". In other examples, any number of labels are utilized to describe the various content types that are listed within a profile. - In
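The hierarchy of content types described above ("interiors of cathedrals" as a refined subset of "buildings") might be represented as in the following sketch. This is illustrative only; the tree, labels, and field names are assumptions, not part of the disclosed embodiment:

```python
# Illustrative content-type tree; the labels are assumptions for this sketch.
CONTENT_TYPE_TREE = {
    "buildings": ["historic buildings", "interiors of cathedrals"],
    "automobiles": ["hot rod automobiles"],
}

def expand_interests(interests):
    """Return the stated interests plus any refined subsets beneath them."""
    expanded = set(interests)
    for interest in interests:
        expanded.update(CONTENT_TYPE_TREE.get(interest, []))
    return expanded

# A unique profile pairing a user identity with content-type interests.
profile = {"user": "user-1", "interests": ["buildings", "hot rod automobiles"]}
```

Any number of labels could be added to the tree; the point is only that a general category can be expanded into its refined subsets when matching listings.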
Block 540, a location of the camera device is determined. In one embodiment, the location of the camera device is determined by the locator system 410. In one instance, the locator system 410 determines the general location of the camera device without finding the exact location. For example, for determining a generalized location of the camera device, a single cellular site is sufficient for detecting the camera device within a generalized area such as within a particular city. - In another instance, the
locator system 410 determines a specific location of the camera device through a GPS system, such as at a particular street address. In one instance, the location of the camera device is determined to within 3 feet. - In
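The coarse cell-site fix and the fine GPS fix described above could sit behind one interface, as in this sketch. The accuracy figures are assumptions (the 3-foot figure from the text is modeled as roughly 1 meter); the device fields are hypothetical:

```python
def locate(device, gps_available):
    """Return (latitude, longitude, accuracy_m) for a camera device.

    Falls back to a coarse cell-site fix (city scale) when no GPS
    fix is available; a GPS fix is modeled as roughly 1 m accuracy.
    """
    if gps_available:
        lat, lon = device["gps_fix"]
        return (lat, lon, 1.0)       # ~3 feet, per the embodiment
    lat, lon = device["cell_site"]
    return (lat, lon, 5000.0)        # single cellular site: city scale
```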
Block 550, suggestions are transmitted to the camera device. In one embodiment, these suggestions give the user choices for photographic opportunities based on the location of the camera device and the interests of the user. - The flow diagram in
FIG. 6 illustrates identifying opportunities to capture content according to one embodiment of the invention. - In
Block 610, a unique profile is identified and the location of the camera device is also identified. - In
Block 620, a search is conducted within the picture opportunity database based on the unique profile and the location of the camera device. In one embodiment, if the location of the camera device is within a city such as San Francisco, then the photographic opportunities listed within the picture opportunity database are limited to listings in San Francisco. In another embodiment, the photographic opportunities listed within the picture opportunity database are limited to listings in the San Francisco Bay Area. In yet another embodiment, the photographic opportunities listed within the picture opportunity database are limited to listings in Northern California.
- In one embodiment, the listings within the picture opportunity database are narrowed and refined based on the unique profile. For example, if the unique profile reflects that historic buildings are of interest, then the listings with the content type related to historic buildings are retained.
- In one embodiment, the number of retained listings ideally would not overwhelm the user while still providing enough choices to select from. For example, in one instance, having between 5 and 10 listings from the picture opportunity database allows enough choices without overwhelming the user and the camera device.
- There are a variety of ways to limit or expand the number of listings transmitted to the camera device. In one embodiment, the radius of the geographical limits is dynamically varied to keep the number of listings offered to the camera device reasonable. For example, the geographic radius is enlarged when there are not enough listings to present to the camera device. In another example, the geographic radius is decreased when there are too many listings to present to the camera device.
- Further, the content type is expanded and broadened when there are not enough listings to present to the camera device in one embodiment. In this instance, instead of selecting "historic buildings" as the content type, "buildings" is utilized as the content type. Similarly, the content type is narrowed when there are too many listings to present to the camera device in one embodiment. In this instance, instead of selecting "historic buildings" as the content type, "historic buildings in historic areas" is utilized as the content type. - In
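One way the dynamic widening and narrowing described above could be realized is a loop that varies the search radius until the listing count falls inside the 5-to-10 band. This is an illustrative assumption about the mechanism; `query` stands in for the picture opportunity database search:

```python
def select_listings(query, location, lo=5, hi=10, radius_km=5.0):
    """Vary the geographic radius so the number of listings offered
    to the camera device stays reasonable (between lo and hi)."""
    found = query(location, radius_km)
    for _ in range(10):          # bounded number of adjustments
        if len(found) < lo:
            radius_km *= 2       # not enough listings: widen the search
        elif len(found) > hi:
            radius_km /= 2       # too many listings: narrow the search
        else:
            break
        found = query(location, radius_km)
    return found[:hi]
```

The same loop shape would apply to broadening or narrowing the content type instead of the radius; only the adjusted parameter changes.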
Block 630, the listings identified in the Block 620 are suggestions that are transmitted to the camera device. - In
Block 640, a particular selection from one of the suggestions is received from the camera device. In one embodiment, the selection requests additional suggestions for the camera device. In this instance, additional suggestions are identified in the Block 620 and these additional suggestions are transmitted to the camera device in the Block 630. - In another embodiment, the selection identifies one of the suggestions transmitted in the
Block 630. - In
Block 650, detailed information related to the selected suggestion is transmitted to the camera device. In one embodiment, the detailed information is stored within the storage module 330. In one embodiment, the detailed information includes directions on how to reach this photographic opportunity, notes describing the environment of the opportunity, sample images to emulate, sample images with common mistakes, and the like. - In
Block 660, dynamic directions are transmitted to the camera device based on the real-time location of the camera device. For example, the dynamic directions call out real-time instructions to the camera device such as "turn right at the next light", "make a u-turn here", and the like. - The flow diagram in
FIG. 7 illustrates customizing photographs captured by the camera device according to one embodiment of the invention. In Block 710, a particular selection from one of the suggestions is received from the camera device. - In
Block 720, a location of the camera device is determined. In one embodiment, the location of the camera device is determined by the locator system 410. - In
Block 730, a captured image recorded by the camera device is detected. In one embodiment, the captured image is recorded by the camera device at the location determined in the Block 720. - In
Block 740, the image captured by the camera device and the location of the camera device when the image was captured are compared with information related to the particular selection within the Block 710.
- In one embodiment, the information related to the particular selection is stored within the storage module 310 within the picture opportunity database. For example, in one instance, the picture opportunity database includes information related to the particular selection such as a location, sample images, and descriptive text associated with the sample images.
- In one embodiment, the location of the camera device when the image was captured is compared to the location of the particular selection. If the camera device was located outside the area of the particular selection when the image was captured, then this image does not correspond to the sample images associated with the particular selection.
- In one embodiment, the captured image is compared with the sample images stored within the picture opportunity database associated with the particular selection. In one instance, commercially available image recognition and comparison algorithms are utilized to determine a sufficient match between the captured image and one of the sample images. - In
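The two-part comparison described above, first against the selection's location and then against its sample images, might look like the following sketch. `similarity` is a stand-in for the commercially available image-recognition algorithms the text mentions; the planar distance test, the 0.5 km area radius, and the 0.8 threshold are assumptions:

```python
import math

def within_area(device_loc, selection_loc, radius_km=0.5):
    """Crude planar distance test; adequate at sub-kilometre scale."""
    dlat_km = (device_loc[0] - selection_loc[0]) * 111.0
    dlon_km = (device_loc[1] - selection_loc[1]) * 111.0 * 0.7
    return math.hypot(dlat_km, dlon_km) <= radius_km

def matches_selection(captured, device_loc, selection, similarity,
                      threshold=0.8):
    """A capture corresponds to the selection only if it was recorded
    inside the selection's area and resembles one of its sample images."""
    if not within_area(device_loc, selection["location"]):
        return False
    return any(similarity(captured, sample) >= threshold
               for sample in selection["sample_images"])
```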
Block 750, text associated with the sample image is selectively added to the captured image. In one embodiment, the text associated with the sample image is stored within the picture opportunity database. In one embodiment, the text associated with the sample image describes details of the captured image such as the location of the sample image, a description of the subject of the captured image, and the like.
- In one embodiment, the text associated with the sample image is added to the captured image when the captured image is recorded by the camera device in a location that corresponds with the particular selection. In another embodiment, the text associated with the sample image is added to the captured image when the captured image matches the sample image. - In
Block 760, the profile corresponding to the user of the camera device is updated according to the captured image. For example, the captured image that is detected in the Block 730 is utilized to refine the profile associated with the user of the camera device. - In one embodiment, the captured image is matched with a sample image corresponding to the particular selection from the comparison in the
Block 740. In this instance, the content type associated with the sample image is utilized to update the profile corresponding to the user of the camera device. - In another embodiment, the captured image does not correspond to a sample image and is independently matched to a content type. For example, since the captured image does not correspond to any of the sample images, the captured image is examined to match a content type unique to the captured image.
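The profile refinement in the Block 760 flow could be sketched as follows. The interest-count representation is an assumption, and `classify` is a hypothetical stand-in for independently matching a capture to a content type when it corresponds to no sample image:

```python
def update_profile(profile, matched_selection, captured, classify):
    """Refine a user's interests after a new capture.

    Reuses the matched selection's content type when the capture
    corresponded to a sample image; otherwise classifies the capture
    independently (classify is a hypothetical stand-in).
    """
    if matched_selection is not None:
        content_type = matched_selection["content_type"]
    else:
        content_type = classify(captured)
    counts = profile.setdefault("interest_counts", {})
    counts[content_type] = counts.get(content_type, 0) + 1
    return profile
```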
- The foregoing descriptions of specific embodiments of the invention have been presented for purposes of illustration and description. For example, the invention is described within the context of suggesting photo opportunities and capturing images as merely one embodiment of the invention. The invention may be applied to a variety of other applications.
- They are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed, and naturally many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/958,189 US20110069183A1 (en) | 2004-02-04 | 2010-12-01 | Methods and apparatuses for identifying opportunities to capture content |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/771,902 US7898572B2 (en) | 2004-02-04 | 2004-02-04 | Methods and apparatuses for identifying opportunities to capture content |
US12/958,189 US20110069183A1 (en) | 2004-02-04 | 2010-12-01 | Methods and apparatuses for identifying opportunities to capture content |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/771,902 Continuation US7898572B2 (en) | 2004-02-04 | 2004-02-04 | Methods and apparatuses for identifying opportunities to capture content |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110069183A1 true US20110069183A1 (en) | 2011-03-24 |
Family
ID=34808547
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/771,902 Expired - Fee Related US7898572B2 (en) | 2004-02-04 | 2004-02-04 | Methods and apparatuses for identifying opportunities to capture content |
US12/958,189 Abandoned US20110069183A1 (en) | 2004-02-04 | 2010-12-01 | Methods and apparatuses for identifying opportunities to capture content |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/771,902 Expired - Fee Related US7898572B2 (en) | 2004-02-04 | 2004-02-04 | Methods and apparatuses for identifying opportunities to capture content |
Country Status (6)
Country | Link |
---|---|
US (2) | US7898572B2 (en) |
EP (1) | EP1712033A4 (en) |
JP (1) | JP5173197B2 (en) |
KR (2) | KR101136648B1 (en) |
CN (2) | CN1914853B (en) |
WO (1) | WO2005076524A1 (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7853100B2 (en) * | 2006-08-08 | 2010-12-14 | Fotomedia Technologies, Llc | Method and system for photo planning and tracking |
US9373076B1 (en) * | 2007-08-08 | 2016-06-21 | Aol Inc. | Systems and methods for building and using social networks in image analysis |
JP4462331B2 (en) | 2007-11-05 | 2010-05-12 | ソニー株式会社 | Imaging apparatus, control method, program |
US20090160970A1 (en) * | 2007-12-20 | 2009-06-25 | Fredlund John R | Remote determination of image-acquisition settings and opportunities |
KR101441587B1 (en) * | 2008-01-30 | 2014-09-23 | 삼성전자 주식회사 | An apparatus for learning photographing profiles of digital imaging devices for personal lifelong recording and learning methods thereof |
US9509867B2 (en) * | 2008-07-08 | 2016-11-29 | Sony Corporation | Methods and apparatus for collecting image data |
US8726324B2 (en) * | 2009-03-27 | 2014-05-13 | Motorola Mobility Llc | Method for identifying image capture opportunities using a selected expert photo agent |
US20110022529A1 (en) * | 2009-07-22 | 2011-01-27 | Fernando Barsoba | Social network creation using image recognition |
US8228413B2 (en) * | 2009-09-01 | 2012-07-24 | Geovector Corp. | Photographer's guidance systems |
US20130286244A1 (en) * | 2010-03-23 | 2013-10-31 | Motorola Mobility Llc | System and Method for Image Selection and Capture Parameter Determination |
US20120151367A1 (en) * | 2010-12-10 | 2012-06-14 | Nokia Corporation | Method and apparatus for registering a content provider channel for recommendation of content segments |
US8781514B2 (en) * | 2010-12-16 | 2014-07-15 | Sony Corporation | System and method for establishing a communication session between context aware portable communication devices |
JP2013207357A (en) * | 2012-03-27 | 2013-10-07 | Sony Corp | Server, client terminal, system, and program |
KR20160114434A (en) * | 2015-03-24 | 2016-10-05 | 삼성전자주식회사 | Electronic Device And Method For Taking Images Of The Same |
US20180278565A1 (en) * | 2017-03-23 | 2018-09-27 | International Business Machines Corporation | Photo stimulus based on projected gaps/interest |
KR102620877B1 (en) * | 2019-01-09 | 2024-01-05 | 삼성전자주식회사 | Electronic device for recommending photographing spot for an image and method for operating thefeof |
US10630896B1 (en) * | 2019-02-14 | 2020-04-21 | International Business Machines Corporation | Cognitive dynamic photography guidance and pose recommendation |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1155726A (en) | 1997-08-06 | 1999-02-26 | Nippon Telegr & Teleph Corp <Ntt> | Method and device for information guidance for mobile user and recording medium recording information guidance program |
JP3719313B2 (en) * | 1997-08-08 | 2005-11-24 | 三菱電機株式会社 | Information search and distribution device for mobile communication terminal and mobile communication terminal |
US6112181A (en) * | 1997-11-06 | 2000-08-29 | Intertrust Technologies Corporation | Systems and methods for matching, selecting, narrowcasting, and/or classifying based on rights management and/or other information |
JP4273273B2 (en) * | 1999-01-14 | 2009-06-03 | 株式会社エクォス・リサーチ | Destination setting device |
JP2001036842A (en) | 1999-07-16 | 2001-02-09 | Canon Inc | Image processor, image processing method and storage medium |
JP2001216328A (en) | 2000-02-04 | 2001-08-10 | Canon Inc | Information processor, network system, image information providing method, and recording medium |
JP3636026B2 (en) | 2000-04-10 | 2005-04-06 | トヨタ自動車株式会社 | Travel information server |
JP3614756B2 (en) | 2000-04-18 | 2005-01-26 | 日本電信電話株式会社 | Personal adaptive information guidance method and system |
JP2002214681A (en) | 2001-01-22 | 2002-07-31 | Konica Corp | Camera with photography spot guiding function and photographic condition setting function, and photography spot guidance and photographic condition setting service system |
GB2403365B (en) * | 2003-06-27 | 2008-01-30 | Hewlett Packard Development Co | An autonomous camera having exchangeable behaviours |
- 2004
  - 2004-02-04 US US10/771,902 patent/US7898572B2/en not_active Expired - Fee Related
- 2005
  - 2005-01-27 KR KR1020117011652A patent/KR101136648B1/en not_active IP Right Cessation
  - 2005-01-27 EP EP05722732A patent/EP1712033A4/en not_active Withdrawn
  - 2005-01-27 CN CN2005800037378A patent/CN1914853B/en not_active Expired - Fee Related
  - 2005-01-27 CN CN2011100042049A patent/CN102111444B/en not_active Expired - Fee Related
  - 2005-01-27 KR KR1020067015700A patent/KR101060066B1/en not_active IP Right Cessation
  - 2005-01-27 WO PCT/US2005/003547 patent/WO2005076524A1/en not_active Application Discontinuation
  - 2005-01-27 JP JP2006552258A patent/JP5173197B2/en not_active Expired - Fee Related
- 2010
  - 2010-12-01 US US12/958,189 patent/US20110069183A1/en not_active Abandoned
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6396537B1 (en) * | 1997-11-24 | 2002-05-28 | Eastman Kodak Company | Photographic system for enabling interactive communication between a camera and an attraction site |
US20020093435A1 (en) * | 2001-01-18 | 2002-07-18 | Baron John M. | Electronic tour guide and photo location finder |
US6459388B1 (en) * | 2001-01-18 | 2002-10-01 | Hewlett-Packard Company | Electronic tour guide and photo location finder |
US20030009281A1 (en) * | 2001-07-09 | 2003-01-09 | Whitham Charles Lamont | Interactive multimedia tour guide |
US6526351B2 (en) * | 2001-07-09 | 2003-02-25 | Charles Lamont Whitham | Interactive multimedia tour guide |
US6516154B1 (en) * | 2001-07-17 | 2003-02-04 | Eastman Kodak Company | Image revising camera and method |
US6526234B1 (en) * | 2001-07-17 | 2003-02-25 | Eastman Kodak Company | Revision suggestion camera and method |
US20030020816A1 (en) * | 2001-07-27 | 2003-01-30 | Hunter Andrew Arthur | Image capturing device |
US20030055901A1 (en) * | 2001-08-30 | 2003-03-20 | International Business Machines Corporation | Customized tours using handheld devices |
US20030140056A1 (en) * | 2002-01-18 | 2003-07-24 | Ford Motor Company | System and method for retrieving information using position coordinates |
US6731239B2 (en) * | 2002-01-18 | 2004-05-04 | Ford Motor Company | System and method for retrieving information using position coordinates |
US20040174434A1 (en) * | 2002-12-18 | 2004-09-09 | Walker Jay S. | Systems and methods for suggesting meta-information to a camera user |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9286294B2 (en) | 1992-12-09 | 2016-03-15 | Comcast Ip Holdings I, Llc | Video and digital multimedia aggregator content suggestion engine |
US9813641B2 (en) | 2000-06-19 | 2017-11-07 | Comcast Ip Holdings I, Llc | Method and apparatus for targeting of interactive virtual objects |
US10349096B2 (en) | 2001-08-03 | 2019-07-09 | Comcast Ip Holdings I, Llc | Video and digital multimedia aggregator content coding and formatting |
US10140433B2 (en) | 2001-08-03 | 2018-11-27 | Comcast Ip Holdings I, Llc | Video and digital multimedia aggregator |
US10517140B2 (en) | 2004-08-24 | 2019-12-24 | Comcast Cable Communications, Llc | Determining a location of a device for calling via an access point |
US9055550B1 (en) | 2004-08-24 | 2015-06-09 | Comcast Cable Holdings, Llc | Locating a voice over packet (VoP) device connected to a network |
US9648644B2 (en) | 2004-08-24 | 2017-05-09 | Comcast Cable Communications, Llc | Determining a location of a device for calling via an access point |
US9049132B1 (en) | 2004-08-24 | 2015-06-02 | Comcast Cable Holdings, Llc | Locating a voice over packet (VoP) device connected to a network |
US10070466B2 (en) | 2004-08-24 | 2018-09-04 | Comcast Cable Communications, Llc | Determining a location of a device for calling via an access point |
US9036626B2 (en) | 2004-08-24 | 2015-05-19 | Comcast Cable Holdings, Llc | Method and system for locating a voice over internet protocol (VOIP) device connected to a network |
US8724522B2 (en) * | 2004-08-24 | 2014-05-13 | Comcast Cable Holdings, Llc | Method and system for locating a voice over internet protocol (VoIP) device connected to a network |
US20110116420A1 (en) * | 2004-08-24 | 2011-05-19 | Comcast Cable Communications, Llc | Method and System for Locating a Voice over Internet Protocol (VOIP) Device Connected to a Network |
US11252779B2 (en) | 2004-08-24 | 2022-02-15 | Comcast Cable Communications, Llc | Physical location management for voice over packet communication |
US20170353655A1 (en) * | 2016-06-01 | 2017-12-07 | Canon Kabushiki Kaisha | Notification system, wearable device, information processing apparatus, control method thereof, and computer-readable storage medium |
US10630894B2 (en) * | 2016-06-01 | 2020-04-21 | Canon Kabushiki Kaisha | Notification system, wearable device, information processing apparatus, control method thereof, and computer-readable storage medium |
US11956852B2 (en) | 2022-02-11 | 2024-04-09 | Comcast Cable Communications, Llc | Physical location management for voice over packet communication |
Also Published As
Publication number | Publication date |
---|---|
KR101060066B1 (en) | 2011-08-29 |
CN1914853A (en) | 2007-02-14 |
US20050172147A1 (en) | 2005-08-04 |
US7898572B2 (en) | 2011-03-01 |
CN102111444A (en) | 2011-06-29 |
CN102111444B (en) | 2012-10-10 |
KR101136648B1 (en) | 2012-04-18 |
WO2005076524A1 (en) | 2005-08-18 |
WO2005076524B1 (en) | 2005-10-20 |
JP5173197B2 (en) | 2013-03-27 |
KR20060130642A (en) | 2006-12-19 |
CN1914853B (en) | 2011-03-16 |
JP2007520830A (en) | 2007-07-26 |
EP1712033A4 (en) | 2011-04-13 |
KR20110074784A (en) | 2011-07-01 |
EP1712033A1 (en) | 2006-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110069183A1 (en) | Methods and apparatuses for identifying opportunities to capture content | |
US9449228B1 (en) | Inferring locations from an image | |
CN101454639B (en) | System and method for displaying location-specific images ona mobile device | |
JP5739874B2 (en) | Search system and method based on orientation | |
US9269011B1 (en) | Graphical refinement for points of interest | |
US20120093369A1 (en) | Method, terminal device, and computer-readable recording medium for providing augmented reality using input image inputted through terminal device and information associated with same input image | |
US8935085B1 (en) | Switching between best views of a place | |
US20140204093A1 (en) | Map display system and map display method | |
US20140223319A1 (en) | System, apparatus and method for providing content based on visual search | |
US20090297067A1 (en) | Apparatus providing search service, method and program thereof | |
US20130328931A1 (en) | System and Method for Mobile Identification of Real Property by Geospatial Analysis | |
US9467660B1 (en) | Map generation using map features from user captured images | |
WO2005076913A2 (en) | Methods and apparatuses for formatting and displaying content | |
WO2005076896A2 (en) | Methods and apparatuses for broadcasting information | |
US20070276592A1 (en) | Method for deep mapping | |
KR20160141087A (en) | Providing system and method of moving picture contents for based on augmented reality location of multimedia broadcast scene | |
GB2412520A (en) | Image and location-based information viewer | |
TW201015354A (en) | Search methods and systems based on time information, and machine readable medium thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EDWARDS, ERIC;SATO, ROBERT;SIGNING DATES FROM 20101217 TO 20110112;REEL/FRAME:025779/0785 |
|
AS | Assignment |
Owner name: SONY ELECTRONICS, INC., NEW JERSEY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EDWARDS, ERIC;SATO, ROBERT;SIGNING DATES FROM 20101217 TO 20110112;REEL/FRAME:026132/0038
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EDWARDS, ERIC;SATO, ROBERT;SIGNING DATES FROM 20101217 TO 20110112;REEL/FRAME:026132/0038 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |