US20130142344A1 - Automatic location-specific content selection for portable information retrieval devices - Google Patents
- Publication number
- US20130142344A1 (U.S. application Ser. No. 13/753,243)
- Authority
- US
- United States
- Prior art keywords
- data
- user
- ird
- information
- management module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9537—Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R29/00—Monitoring arrangements; Testing arrangements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
Definitions
- the present invention relates to applications pertaining to information retrieval devices (IRD's) connected to the Internet. More specifically, this invention relates to a method and apparatus for anticipating a user's desired information and delivering this information to the user through an IRD device, such as a personal digital assistant (PDA).
- IRD information retrieval devices
- PDA personal digital assistant
- IRD's remain unable to perform even the most basic task of anticipating the needs of the user and gathering information related to those needs, without the user having to enter such needs.
- a PDA is defined here as a handheld device that performs various computing functions for the user.
- a PDA is often referred to as a handheld personal computer, capable of performing such tasks as maintaining an address book, notepad, appointments diary, and phone list.
- these capabilities began to include more demanding applications such as spreadsheets, word processors, databases, financial management software, and games.
- Today, the emergence of wireless Web technology has given PDA manufacturers an additional feature, Internet access, with which to market their respective products.
- the Internet is defined here as a collection of interconnected (public and/or private) networks linked together by a set of standard protocols (such as TCP/IP and HTTP) to form a global, distributed network. While this term is intended to refer to what is now commonly known as the Internet, it is also intended to encompass variations which may be made in the future, including changes and additions to existing standard protocols.
- a method for anticipating a user's desired information using an information retrieval device (IRD) connected to a computer network is provided.
- This method further comprises maintaining a database of user tendencies within the computer network, receiving sensor data from the user's physical environment via the IRD, generating query strings using both tendency data and sensor data, retrieving data from external data sources using these generated query strings, organizing the retrieved data into electronic folders, and delivering this organized data to the user via the IRD.
- a data management module anticipates the type of information a user desires by combining real time data taken from a sensor unit within an IRD and data regarding the history of that particular user's tendencies stored within the data management module.
- an IRD sensor unit may be comprised of a “Dictation” setting enabling the user to dictate a conversation in real time.
- a user may simply select a setting corresponding to “Dictation”.
- more sophisticated “Dictation” settings may also be implemented.
- Such embodiments may include a textual analysis which launches certain applications whenever particular names are detected. For example, if the name “John Doe” is detected, the IRD would relay any available information regarding “John Doe” to the user from the data management module.
- such information may include user-specific information obtained directly from the data management module (e.g., an address book, calendar, etc.) or information obtained from external data sources (e.g., an online database, search engine, etc.).
- the device may use both user tendencies and data relating to the physical environment in order to choose between and prioritize multiple matching results, as for example picking the ten “John Doe” matches that live closest to the physical location of the user and organizing them by that proximity, or by returning the “John Doe” matches that are lawyers based upon the user's tendency to request further information on past matches who were lawyers.
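The match prioritization described above can be sketched as a simple scoring function that combines proximity to the user with a tendency-based boost. All field names, coordinates, and the 0.5 profession boost below are illustrative assumptions, not details taken from the patent:

```python
import math

def rank_matches(matches, user_loc, preferred_professions, top_n=10):
    """Rank name matches by distance to the user, boosting professions
    the user has historically followed up on (hypothetical scoring)."""
    def score(m):
        # Euclidean distance on (lat, lon) pairs is fine for a sketch.
        dist = math.dist(user_loc, m["loc"])
        # Halve the effective distance for professions the user tends
        # to request further information about; lower score ranks first.
        boost = 0.5 if m["profession"] in preferred_professions else 1.0
        return dist * boost
    return sorted(matches, key=score)[:top_n]

matches = [
    {"name": "John Doe", "loc": (40.7, -74.0), "profession": "lawyer"},
    {"name": "John Doe", "loc": (40.8, -74.1), "profession": "plumber"},
    {"name": "John Doe", "loc": (41.5, -72.7), "profession": "lawyer"},
]
ranked = rank_matches(matches, user_loc=(40.7, -74.0),
                      preferred_professions={"lawyer"})
print([m["loc"] for m in ranked])
```

In this scheme proximity and tendency data trade off against each other, so a strongly preferred profession can outrank a nearer match.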
- Another such embodiment may include a hybrid recording and transcription sensor setting.
- the IRD may generate a running transcript which includes the locations of the speakers relative to the IRD in order to differentiate between different speakers in a conversation.
- the sound is simply recorded and included in the transcript as a hyperlink.
- An additional feature to this embodiment may include a real time translator application which translates between languages.
- the IRD sensor unit may include a GPS receiver that enables the IRD to retrieve information within the context of data received by the GPS receiver. For example, if the GPS signal indicates that the user is in Japan, the data management module may give weight to search results related to Japan.
- data ascertained from a GPS receiver may also be used to present information about a particular establishment or area the user is in.
- a real estate broker for example, may approach a home for sale and receive a list of information regarding that particular home. Such information may include the address of the home, the sale price of the home, and information regarding the neighborhood of the home organized in several electronic folders.
- the IRD may be used to analyze ambient sounds detected by its sensor unit.
- the IRD may, for example, identify the song and artist of music detected by the sensor unit through a spectral analysis of the sensor data.
- the IRD may also be used to detect sounds from a telephone touch dial and determine the numbers being dialed.
- An added feature to this embodiment may include a reverse lookup of the phone number which would display information regarding the person on the receiving end of the call.
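Determining dialed digits from touch-tone audio, as described above, is classically done with the Goertzel algorithm evaluated at the eight DTMF frequencies; the patent does not name a method, so the Goertzel approach below is one plausible sketch. It synthesizes a tone for the digit "5" (770 Hz + 1336 Hz) and decodes it; the sample rate and tone length are arbitrary choices:

```python
import math

DTMF_ROWS = [697, 770, 852, 941]
DTMF_COLS = [1209, 1336, 1477, 1633]
KEYS = ["123A", "456B", "789C", "*0#D"]

def goertzel_power(samples, freq, rate):
    """Power of a single frequency component (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / rate)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def detect_digit(samples, rate=8000):
    """Pick the strongest row and column frequency to identify the key."""
    row = max(DTMF_ROWS, key=lambda f: goertzel_power(samples, f, rate))
    col = max(DTMF_COLS, key=lambda f: goertzel_power(samples, f, rate))
    return KEYS[DTMF_ROWS.index(row)][DTMF_COLS.index(col)]

# Synthesize 50 ms of the tone for "5" (770 Hz + 1336 Hz) and decode it.
rate = 8000
tone = [math.sin(2 * math.pi * 770 * n / rate) +
        math.sin(2 * math.pi * 1336 * n / rate) for n in range(400)]
print(detect_digit(tone, rate))
```

The detected digits could then feed the reverse phone-number lookup mentioned above.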
- FIG. 1 is a block diagram demonstrating a preferred embodiment of the invention.
- FIG. 2 is a flow chart illustrating the steps for users to access anticipated data according to an embodiment of the invention.
- FIG. 3 is a schematic illustration of a sensor activation Web page according to an embodiment of the invention.
- FIG. 4 is a flow chart illustrating the steps for generating anticipated data according to an embodiment of the invention.
- FIG. 5 is a schematic illustration of a Web page with various electronic folders containing links to anticipated data according to an embodiment of the invention.
- FIG. 6 is a schematic illustration of a Web page with various links to anticipated data according to an embodiment of the invention.
- FIG. 7 is a schematic illustration of a Web page displaying anticipated data according to an embodiment of the invention.
- the present invention is directed towards a method and apparatus for delivering content via informational retrieval devices.
- a data management module anticipates the type of information a user desires by combining real time data taken from a sensor unit connected to the IRD and data regarding the history of that particular user's tendencies stored within the data management module.
- Referring to FIG. 1 , a block diagram is illustrated of a wide area network employing a method and apparatus according to an embodiment of the invention. It is anticipated that the present invention operates with a plurality of computers which are coupled together on a wide area network, such as the Internet 20 , or other communications network.
- FIG. 1 depicts such a network which includes an information retrieval device (IRD) 10 , a data management module 30 , and an external data source 40 .
- the IRD 10 is further comprised of an applications processor 13 coupled to a controller 11 , a display unit 12 , a Web browser 14 , a context memory 15 , and a data memory 16 .
- the context memory 15 is shown connected to both a sensor unit 17 and to the data memory 16 .
- the sensor unit 17 is also shown to be connected to an analog-to-digital (A/D) converter 18 which is directly connected to the data memory 16 .
- A/D analog-to-digital
- a user determines which sensors to activate using the controller 11 of the IRD 10 . These selections are then received by the applications processor 13 where they are relayed to the context memory 15 .
- the context memory 15 includes a set of instructions that activate particular sensors comprising the sensor unit 17 of the IRD 10 .
- the function of the context memory 15 is thus analogous to an instruction cache for the sensor unit 17 .
- Analog sensor data is then passed from the sensor unit to the A/D converter where it is converted to digital data. This digital data is then compressed and temporarily stored in the IRD 10 data memory unit 16 until it is ready to be sent to the data management module 30 .
- the IRD 10 communicates with the data management module 30 and external data sources 40 via the Internet 20 .
- the data management module 30 is further comprised of a core processor 31 coupled to a client information database 35 , a search string database 39 , a search engine 37 , and a Web server 33 connected to an HTML (Hyper-Text Markup Language) documents database 34 .
- HTML Hyper-Text Markup Language
- a search engine 37 typically incorporates a database engine, such as a SQL Server™ engine from Microsoft Corporation or an Oracle™ database engine, as part of its architecture. Search engines typically perform searches by operating on a string of characters, known as a “query string.”
- a query string is coded according to a set of rules determined by the database engine and/or a user interface between the database engine and the user.
- a “query” is broader than a “query string,” denoting both the query string and the search logic represented by the query string, whereas “query string” refers only to a string of characters, symbols, or codes used to define a query.
- Web server 33 accesses a plurality of Web pages, distributable applications, and other electronic files containing information of various types stored in the HTML documents database.
- Web pages may be viewed on various web-enabled computers in a given network, such as the information retrieval device 10 .
- a particular Web page or other electronic file may be viewed through a suitable application program residing on the information retrieval device 10 such as a browser 14 , or by a distributable application provided to the information retrieval device 10 , by the Web server 33 .
- a suitable application program residing on the information retrieval device 10 such as a browser 14
- a distributable application provided to the information retrieval device 10
- a user identifies a Web page it wishes to retrieve using the information retrieval device 10 by communicating an HTTP (Hyper-Text Transport Protocol) request from the browser application 14 .
- the HTTP request includes the Uniform Resource Locator (URL) of the desired Web page, which may correspond to an HTML document stored in the HTML documents database 34 .
- the HTTP request is then routed to the Web server 33 via the Internet 20 .
- the Web server 33 retrieves the HTML document identified by the URL, and communicates the HTML document across the Internet 20 to the browser application 14 .
- the HTML document may be communicated in the form of plural message packets as defined by standard protocols, such as the Transmission Control Protocol/Internet Protocol (TCP/IP).
- TCP/IP Transmission Control Protocol/Internet Protocol
- the IRD 10 provides users with information received from a data management module 30 .
- a data management module 30 anticipates the type of information a user desires through real time data taken from the sensor unit 17 of the IRD 10 . It should, however, be appreciated that users may have the option of retrieving information by compiling a query which combines this sensor data with data regarding the history of that particular user's “tendencies” stored in the client information database 35 .
- “Tendencies” are defined here as a topical measure of a user's information interests
- Various methods may be used to determine a user's tendencies, such as, for example, providing a form or Web page for the user to designate topical areas of interest, analyzing the user's demographic and purchasing information to ascertain likely areas of interest, and analyzing a record or history of the user's prior queries. It may be particularly useful to rank user tendencies in priority order. For example, if tendencies are to be determined from a record of prior queries, the tendencies could be ranked according to factors such as, for example, the frequency with which the same or similar queries have been repeated, the number of related queries in a topical area, and the length of time since the query was repeated.
- These and various other methods as known in the art may be used to determine a user's tendencies, and the invention is not limited by the method by which the determination is made.
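A ranking of tendencies along the factors listed above (repetition frequency and recency of prior queries) might look like the following sketch; the log format, weights, and topic names are hypothetical, not taken from the patent:

```python
from collections import defaultdict

def rank_tendencies(query_log, today, w_freq=1.0, w_recency=2.0):
    """Rank topical tendencies from a log of (topic, day) prior queries.
    Score = repetition count plus a recency bonus that decays with the
    days elapsed since the topic last appeared (hypothetical weighting)."""
    freq = defaultdict(int)
    last_seen = {}
    for topic, day in query_log:
        freq[topic] += 1
        last_seen[topic] = max(last_seen.get(topic, day), day)

    def score(topic):
        days_since = today - last_seen[topic]
        return w_freq * freq[topic] + w_recency / (1 + days_since)

    return sorted(freq, key=score, reverse=True)

log = [("stocks", 1), ("stocks", 5), ("stocks", 9),
       ("real estate", 8), ("music", 10)]
print(rank_tendencies(log, today=10))
```

A priority-ordered list like this could then be stored in the client information database 35 and consulted when refining queries.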
- Referring to FIG. 2 , a flow chart illustrating the steps for users to access such anticipated data according to an embodiment of the invention is shown.
- the procedure begins with power being applied to the IRD 10 at step 100 .
- the user is then asked which particular sensors they would like to activate at step 106 .
- An example of a sensor activation Web page displayed to the user according to an embodiment of the invention is shown in FIG. 3 .
- the IRD 10 display unit 12 is comprised of various fields.
- Such fields may comprise a plurality of sensor fields including a “Sensor One” field 200 , a “Sensor Two” field 205 , a “Sensor Three” field 210 , as well as all other sensor fields up to sensor field n 215 (where n represents the total number of sensors available to the user).
- Other fields displayed to the user may include fields used to scroll through other sensors, not currently displayed, such as an “Additional Sensors” field 220 and a “Previous Sensors” field 225 . If the user wishes to activate specific sensors at step 105 , then these sensors are selected by the user at step 110 and activated by the IRD 10 at step 120 by selecting the “Begin” field 235 illustrated in FIG. 3 . Otherwise, the user selects the “Default Settings” field 230 , causing the IRD 10 to select sensors specified by its default settings at step 115 and thus activating those default sensors corresponding to these settings at step 120 .
- the IRD 10 begins to scan its environment according to these sensor settings at step 125 .
- this scan procedure may include a plurality of sensors of various types.
- An exemplary embodiment of the invention may, therefore, include sensors such as a microphone and GPS (Global Positioning System) receiver that respectively scan the IRD 10 environment for sound and location.
- GPS Global Positioning System
- a user may choose to activate both the microphone and the GPS receiver in order to simultaneously ascertain data from both devices.
- Other embodiments of sensors may include, but are not limited to, light sensors and motion sensors.
- the IRD 10 then receives analog data from its active sensors at step 130 and converts this data into digital data using the A/D converter 18 at step 135 .
- This digital data is then compressed and temporarily stored in the IRD's data memory unit 16 at step 140 until it is sent to the data management module 30 at step 145 via the Internet 20 .
- the procedure then continues by having the IRD 10 receive compressed feedback data from the data management module 30 at step 150 .
- This data is then decompressed at step 155 and displayed to the user at step 160 .
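The compress, send, receive, and decompress round trip of steps 140 through 160 can be sketched as follows, with zlib and JSON standing in for whatever compression and serialization schemes the device would actually use (both are assumptions for illustration):

```python
import json
import zlib

def package_sensor_data(readings):
    """Serialize and compress sensor readings for transmission to the
    data management module (zlib is a stand-in codec)."""
    return zlib.compress(json.dumps(readings).encode())

def unpack_feedback(payload):
    """Decompress and parse compressed feedback data for display."""
    return json.loads(zlib.decompress(payload).decode())

readings = {"microphone": [0.01, 0.02], "gps": [35.68, 139.69]}
payload = package_sensor_data(readings)
# ... payload travels to the data management module; feedback returns ...
feedback = unpack_feedback(zlib.compress(json.dumps(
    {"folders": ["Restaurants", "Maps"]}).encode()))
print(feedback["folders"])
```

The same pair of helpers covers both directions of the exchange, since the patent describes compression on each leg.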
- Referring to FIG. 4 , a flow chart illustrating the steps for generating the feedback data received by the IRD 10 according to an embodiment of the invention is shown.
- This procedure begins with the data management module 30 receiving a data signal from the IRD 10 at step 300 .
- the received data is then decompressed at step 305 in order to generate primary search strings from data taken from the IRD 10 sensor unit 17 at step 310 .
- the primary search strings generated at step 310 may be viewed as simple query strings, found within the search string database 39 using sensor unit 17 data, which may be used to ascertain information from conventional search engines available through the Internet 20 .
- Such primary search strings may, for example, include “restaurants in city y”, which would correspond to data received from both a sound sensor, sensing that the word “restaurant” was said, and from a GPS receiver sensor, sensing that the user is in “city y”.
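Primary search string generation of this kind can be sketched as template matching against the available sensor readings; the template table below is a hypothetical stand-in for the search string database 39 , and the field names are illustrative:

```python
def primary_search_strings(sensor_data, templates):
    """Return query strings whose required sensor fields are all present,
    filled in from the sensor readings (a simple templating sketch)."""
    strings = []
    for template, required in templates:
        if all(key in sensor_data for key in required):
            strings.append(template.format(**sensor_data))
    return strings

# "restaurants in city y" fires only when both a spoken keyword and a
# GPS-derived city are present, mirroring the patent's example.
templates = [
    ("{keyword}s in {city}", ("keyword", "city")),
    ("{keyword} near me", ("keyword",)),
]
sensor_data = {"keyword": "restaurant", "city": "city y"}
print(primary_search_strings(sensor_data, templates))
```

Each resulting string could be submitted as-is to a conventional search engine available through the Internet 20 .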
- the procedure continues with the data management module 30 determining the identity of the client at step 315 .
- the data management module 30 then opens the appropriate customer file from the client information database 35 which optionally opens the client's profile of navigation tendencies. It should be appreciated that these navigation tendencies may be repeatedly calculated at a user-defined rate from within the core processor 31 using an arbitrary statistical weighting system determined either by the user or the IRD 10 manufacturer.
- an internal search string database 39 may be used simultaneously with the client information database 35 to generate secondary search strings that directly correspond to the tendencies of the user.
- the core processor 31 searches for more specific search strings from within the search string database 39 which more accurately reflect the anticipated information desired by the user according to tendency data stored in the customer information database 35 . These search strings are then combined with the primary search strings found at step 310 in order to generate secondary search strings at step 325 .
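Combining the primary strings with tendency-derived strings into secondary search strings (step 325) might be sketched as a simple cross product; the concatenation scheme below is illustrative only, as the patent does not specify how the strings are combined:

```python
def secondary_search_strings(primary, tendency_terms):
    """Refine each primary query string with each of the user's
    top-ranked tendency terms; fall back to the primary strings when
    no tendency data is available (an assumed combination rule)."""
    combined = [f"{p} {t}" for p in primary for t in tendency_terms]
    return combined or primary

primary = ["restaurants in city y"]
tendencies = ["sushi", "vegetarian"]
print(secondary_search_strings(primary, tendencies))
```

The fallback keeps the pipeline usable for a new user whose tendency profile is still empty.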
- the procedure continues at step 330 with an external data source 40 search being made according to the secondary search strings found at step 325 .
- This external data source 40 may, for example, be provided by a conventional search engine, an external database service provider, or any other data source available via the Internet 20 .
- an internal weighting algorithm is again implemented in order to determine which returned search results best match the information desired by the user as anticipated by the data management module 30 .
- the data management module 30 selects only those search results receiving a criterion score above some predetermined threshold at step 335 , and organizes these selected search results into various electronic folders at step 340 . This data is then compressed by the data management module 30 at step 345 , and finally sent to the IRD 10 at step 350 .
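The threshold selection and folder organization of steps 335 and 340 can be sketched as follows; the score field, threshold value, and folder topics are hypothetical, since the patent leaves the weighting algorithm unspecified:

```python
def select_and_file(results, threshold):
    """Keep only results whose weighted score clears the predetermined
    threshold, then group the survivors into electronic folders by
    topic (field names are illustrative)."""
    folders = {}
    for r in results:
        if r["score"] >= threshold:
            folders.setdefault(r["topic"], []).append(r["title"])
    return folders

results = [
    {"title": "Sushi Bar A", "topic": "Restaurants", "score": 0.9},
    {"title": "Noodle Shop B", "topic": "Restaurants", "score": 0.4},
    {"title": "City Y Map", "topic": "Maps", "score": 0.8},
]
print(select_and_file(results, threshold=0.5))
```

Each dictionary key here would correspond to one of the folder fields displayed in FIG. 5 .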
- An example of a Web page including such electronic folders displayed to the user according to an embodiment of the invention is shown in FIG. 5 .
- the IRD 10 display unit 12 is again comprised of various fields.
- the display unit 12 may be comprised of a plurality of folder fields which include a “Folder One” field 400 , a “Folder Two” field 405 , a “Folder Three” field 410 as well as all other folder fields up to folder field n 415 (where n represents the total number of folders displayed to the user).
- Other fields displayed to the user may include fields used to scroll through other folders, not currently displayed, such as an “Additional Folders” field 420 and a “Previous Folders” field 425 . If none of these folders include the user's desired search results, the user may enter their own search string in the field labeled “Search Field” 435 . It should be appreciated that any search string entered by the user via the “Search Field” is used by the data management module 30 at step 330 of the flow chart illustrated in FIG. 4 in order to extract data from an external data source 40 according to this particular search string. It should be further appreciated that, at any time, the user may exit the Web page illustrated in FIG. 5 by selecting the “Home” field 430 in order to modify the active sensor settings. As a result of this selection, the IRD 10 would redisplay the sensor activation Web page illustrated in FIG. 3 and thus return the user to step 105 of the flow chart illustrated in FIG. 2 .
- the IRD 10 display unit 12 is again comprised of various fields.
- the display unit 12 may be comprised of a plurality of link fields which include a “Link One” field 500 , a “Link Two” field 505 , a “Link Three” field 510 as well as all other link fields up to link field n 515 (where n represents the total number of links displayed to the user).
- Other fields displayed to the user may include fields used to scroll through other links, not currently displayed, such as an “Additional Links” field 520 and a “Previous Links” field 525 . If none of these links include the user's desired search results, the user may again enter their own search string in the field labeled “Search Field” 535 where, similar to the electronic folder Web page described with respect to FIG. 5 , this string is used by the data management module 30 at step 330 of the flow chart illustrated in FIG. 4 . Also similar to the electronic folder Web page described with respect to FIG. 5 , the user may exit the Web page illustrated in FIG. 6 by selecting the “Home” field 530 in order to modify the active sensor settings at any time.
- An example of a Web page including such fields according to an embodiment of the invention is shown in FIG. 7 .
- the display unit 12 pertaining to this particular Web page may be comprised of a plurality of fields which include the aforementioned “Selected Data” field 600 containing the data corresponding to the selected link.
- the data displayed to the user in the “Selected Data” field 600 may be provided in various forms.
- an investor may wish to analyze the performance of a particular stock.
- the investor may be presented with a set of links which may include links to graphs, spreadsheets, or news regarding that particular stock.
- One such field may include a “Related Folders” field 605 which may be used in order to generate a Web page similar to the one illustrated in FIG. 5 based on a modified search which includes strings related to the data currently being displayed in the “Selected Data” field 600 .
- a “Back” and “Forward” field, 610 and 615 respectively, may also be included in order to navigate through the various Web pages selected by the user. Similar to the Web pages described above with respect to FIGS. 5 and 6 , a “Home” field 620 and a “Search Field” 625 having the same functionality as previously described may be included as well.
- an IRD 10 sensor unit 17 may be comprised of a “Dictation” setting enabling the user to dictate a conversation in real time.
- a user may simply select the setting corresponding to “Dictation”.
- more sophisticated “Dictation” settings may also be implemented.
- Such embodiments may include a textual analysis which launches certain applications whenever particular word patterns or words matching the user's tendency data are detected. For example, if the name “John Doe” is repeatedly detected, the IRD 10 may relay any available information regarding “John Doe” to the user from the data management module 30 .
- the address book and/or other information pertaining to “John Doe” may be retrieved and displayed. It should be appreciated that such information may include user-specific information obtained directly from the data management module 30 (e.g., an address book, calendar, etc.) or information obtained from external data sources 40 (e.g., an online database, search engine, etc.).
- the IRD 10 may retrieve data on selected terms within the context of other terms being used.
- if the terms “head end” and “cable” are detected together, for example, the data management module 30 may conduct a search for these two terms together instead of individually.
- the data management module 30 may return information describing the function of a “head end” in conjunction with a cable system.
- Another such embodiment may include a hybrid recording and transcription sensor setting which may be used in conjunction with information retrieval services or as a separate feature.
- the IRD 10 may generate a running transcript which includes the locations of the speakers relative to the IRD 10 in order to differentiate between different speakers in a conversation. In cases where the IRD 10 is unable to convert a sound into a word, the sound is simply recorded and included in the transcript as a hyperlink.
- An additional feature to this embodiment may include a real time translator application which translates between languages.
- the IRD 10 sensor unit 17 may include a GPS receiver that enables the IRD 10 to retrieve information within the context of data received by the GPS receiver. For example, if the GPS signal indicates that the user is in Japan, the data management module 30 may give weight to search results related to Japan. Thus, in the prior example, the terms “head end” and “cable” might also return information about Japanese cable television operators.
- the IRD 10 may also anticipate the user's needs by retrieving more specific GPS location data. For example, if the user is entering a video rental store, the IRD 10 may retrieve a list of the most popular rental videos for that week, together with movies being shown on the user's local cable system. Similarly, a user entering a hardware store might be presented with the hardware store's current advertised specials, together with links for reviews of those items. Furthermore, a user may ask a salesperson a question regarding “washers” from which the IRD 10 may distinguish the user's request as being one for plumbing device “washers”, as opposed to home appliance “washers”, because the user is located in a hardware store. In this respect, the data management module 30 would conduct its search accordingly.
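The venue-based disambiguation in the “washers” example above can be sketched as a lookup keyed on the type of establishment inferred from GPS data; the sense table below is illustrative, not from the patent:

```python
def disambiguate(term, venue_type, sense_map):
    """Pick a word sense using the venue the GPS data places the user
    in; fall back to the bare term when no sense entry applies."""
    senses = sense_map.get(term, {})
    return senses.get(venue_type, term)

# Hypothetical sense table: the same spoken word resolves differently
# depending on the establishment the user is standing in.
sense_map = {
    "washer": {
        "hardware store": "plumbing washer (sealing ring)",
        "appliance store": "washing machine",
    }
}
print(disambiguate("washer", "hardware store", sense_map))
```

The resolved sense would then be folded into the query strings the data management module 30 submits to external data sources 40 .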
- data ascertained from a GPS receiver may also be used to present information about a particular establishment or area the user is in.
- a real estate broker for example, may approach a home for sale and receive a list of information regarding that particular home. Such information may include the address of the home, the sale price of the home, and information regarding the neighborhood of the home organized in several electronic folders.
- the IRD 10 may be used to analyze ambient sounds detected by its sensor unit 17 .
- the IRD 10 may, for example, identify the song and artist of music detected by the sensor unit 17 through a spectral analysis of the sensor data.
- the IRD 10 may also be used to detect sounds from a telephone touch dial and determine the numbers being dialed.
- An added feature to this embodiment may include a reverse lookup of the phone number which would display information regarding the person on the receiving end of the call.
Abstract
A method for anticipating a user's desired information using a PDA device connected to a computer network is provided. This method further comprises maintaining a database of user tendencies within the computer network, receiving sensor data from the user's physical environment via the PDA device, generating query strings using both tendency data and sensor data, retrieving data from external data sources using these generated query strings, organizing the retrieved data into electronic folders, and delivering this organized data to the user via the PDA device. In particular, a data management module anticipates the type of information a user desires by combining real time data taken from a sensor unit within a PDA and data regarding the history of that particular user's tendencies stored within the data management module.
Description
- This application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application No. 60/203,169, filed May 8, 2000, which application is specifically incorporated herein, in its entirety, by reference.
- 1. Field of the Invention
- The present invention relates to applications pertaining to information retrieval devices (IRD's) connected to the Internet. More specifically, this invention relates to a method and apparatus for anticipating a user's desired information and delivering this information to the user through an IRD device, such as a personal digital assistant (PDA).
- 2. Description of Related Art
- Computers are currently being used as electronic encyclopedias, with searches becoming increasingly sophisticated and with larger amounts of data being available to the user. Computers suitable for performing such tasks are referred to herein as information retrieval devices (IRD's). A highly portable and particularly useful embodiment of an IRD is represented by personal digital assistants (PDA's). However, IRD's remain unable to perform even the most basic task of anticipating the needs of the user and gathering information related to those needs, without the user having to enter such needs.
- The market for IRD's has become increasingly popular over the past few years. For style-conscious users looking for the latest in electronic organization, PDA's are an attractive option because of their ability to provide users with a plethora of computing functions in a small, portable device. A PDA is defined here as a handheld device that performs various computing functions for the user. In this respect, a PDA is often referred to as a handheld personal computer, capable of performing such tasks as maintaining an address book, notepad, appointments diary, and phone list. With the growth of PDA technology, however, these capabilities began to include more demanding applications such as spreadsheets, word processors, databases, financial management software, and games. Today, the emergence of wireless Web technology has given PDA manufacturers an additional feature, Internet access, with which to market their respective products.
- It should be appreciated that the Internet is defined here as a collection of interconnected (public and/or private) networks linked together by a set of standard protocols (such as TCP/IP and HTTP) to form a global, distributed network. While this term is intended to refer to what is now commonly known as the Internet, it is also intended to encompass variations which may be made in the future, including changes and additions to existing standard protocols.
- Although the ability to retrieve information from the Internet using PDA devices is generally known in the art, an automated procedure for retrieving anticipated information generated according to sensory data taken from the PDA does not exist. It would thus be advantageous to implement a method and apparatus which anticipates a user's desired information and delivers this information directly to the user through a PDA device.
- In an embodiment of the invention, a method for anticipating a user's desired information using an information retrieval device (IRD) connected to a computer network is provided. This method further comprises maintaining a database of user tendencies within the computer network, receiving sensor data from the user's physical environment via the IRD, generating query strings using both tendency data and sensor data, retrieving data from external data sources using these generated query strings, organizing the retrieved data into electronic folders, and delivering this organized data to the user via the IRD. In particular, a data management module anticipates the type of information a user desires by combining real time data taken from a sensor unit within an IRD and data regarding the history of that particular user's tendencies stored within the data management module.
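The method recited above — combining stored tendency data with live sensor data to generate query strings, then organizing the retrieved results into electronic folders — can be sketched as follows. This is a minimal illustration only; the function names, data shapes, and the simple string-concatenation query scheme are assumptions for the sketch, not part of the disclosure.

```python
from collections import defaultdict

# Hypothetical sketch of the anticipation pipeline: cross each stored
# user tendency with each live sensor reading to form query strings,
# then group retrieved results into electronic folders by topic.

def generate_queries(tendencies, sensor_data):
    """Combine stored user tendencies with sensor readings into query strings."""
    queries = []
    for topic in tendencies:
        for reading in sensor_data.values():
            queries.append(f"{topic} {reading}")
    return queries

def organize_into_folders(results):
    """Group retrieved (topic, item) results into folders keyed by topic."""
    folders = defaultdict(list)
    for topic, item in results:
        folders[topic].append(item)
    return dict(folders)

tendencies = ["restaurants", "stock quotes"]   # assumed tendency database contents
sensor_data = {"gps": "city y"}                # assumed decoded sensor reading
queries = generate_queries(tendencies, sensor_data)
# queries: ["restaurants city y", "stock quotes city y"]
```

The real module would of course weight and filter these queries rather than emit every cross-product, but the sketch shows the basic combination of tendency data and sensor data into retrievable query strings.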
- In one such embodiment, an IRD sensor unit may be comprised of a “Dictation” setting enabling the user to dictate a conversation in real time. In this type of embodiment, a user may simply select a setting corresponding to “Dictation”. In other embodiments, more sophisticated “Dictation” settings may also be implemented. Such embodiments may include a textual analysis which launches certain applications whenever particular names are detected. For example, if the name “John Doe” is detected, the IRD would relay any available information regarding “John Doe” to the user from the data management module. It should be appreciated that such information may include user-specific information obtained directly from the data management module (e.g., an address book, calendar, etc.) or information obtained from external data sources (e.g., an online database, search engine, etc.). It should further be appreciated that the device may use both user tendencies and data relating to the physical environment in order to choose between and prioritize multiple matching results, as for example picking the ten “John Doe” matches that live closest to the physical location of the user and organizing them by that proximity, or by returning the “John Doe” matches that are lawyers based upon the user's tendency to request further information on past matches who were lawyers.
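The prioritization described above — picking the "John Doe" matches closest to the user, or filtering by a profession the user has tended to request — might be sketched as below. The record layout, the planar distance measure, and the `profession` filter are illustrative assumptions.

```python
# Hypothetical sketch of choosing among multiple directory matches:
# rank by distance from the user's location, optionally filtering by
# a tendency-derived attribute (e.g. the user's history of selecting lawyers).

def prioritize_matches(matches, user_location, tendency=None, limit=10):
    """matches: list of dicts with 'name', 'location' (x, y), 'profession'."""
    def distance(m):
        dx = m["location"][0] - user_location[0]
        dy = m["location"][1] - user_location[1]
        return (dx * dx + dy * dy) ** 0.5
    if tendency:
        matches = [m for m in matches if m.get("profession") == tendency]
    return sorted(matches, key=distance)[:limit]

matches = [
    {"name": "John Doe", "location": (5, 0), "profession": "lawyer"},
    {"name": "John Doe", "location": (1, 0), "profession": "doctor"},
    {"name": "John Doe", "location": (2, 0), "profession": "lawyer"},
]
nearest = prioritize_matches(matches, user_location=(0, 0))
lawyers = prioritize_matches(matches, (0, 0), tendency="lawyer")
```

Here `nearest` orders all three matches by proximity, while `lawyers` keeps only the lawyer entries, still ordered by distance — the two modes of prioritization the paragraph describes.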
- Another such embodiment may include a hybrid recording and transcription sensor setting. In this embodiment, the IRD may generate a running transcript which includes the locations of the speakers relative to the IRD in order to differentiate between different speakers in a conversation. In cases where the IRD is unable to convert a sound into a word, the sound is simply recorded and included in the transcript as a hyperlink. An additional feature to this embodiment may include a real time translator application which translates between languages.
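One way such a running transcript could be assembled is sketched below: recognized words are emitted as text, unrecognized sounds become hyperlinks to the stored audio clip, and speaker turns are separated by the bearing of the sound source relative to the device. The event format and bearing-based speaker separation are assumptions for illustration.

```python
# Hypothetical sketch of the hybrid recording/transcription setting.
# events: list of dicts with 'bearing' (degrees, speaker direction) and
# either 'word' (recognized speech) or 'clip' (path to recorded audio).

def build_transcript(events):
    lines, current_bearing, words = [], None, []

    def flush():
        # Emit the accumulated words as one speaker turn.
        if words:
            lines.append(f"[speaker @ {current_bearing} deg] " + " ".join(words))
            words.clear()

    for e in events:
        if e["bearing"] != current_bearing:   # new speaker direction -> new turn
            flush()
            current_bearing = e["bearing"]
        if "word" in e:
            words.append(e["word"])
        else:
            # Unconverted sound: keep it as a hyperlink to the recording.
            words.append(f'<a href="{e["clip"]}">[audio]</a>')
    flush()
    return "\n".join(lines)

events = [
    {"bearing": 30, "word": "hello"},
    {"bearing": 30, "clip": "clip1.wav"},
    {"bearing": 210, "word": "hi"},
]
transcript = build_transcript(events)
```

A real implementation would derive the bearing from microphone-array data; here it is simply given per event.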
- In another embodiment, the IRD sensor unit may include a GPS receiver that enables the IRD to retrieve information within the context of data received by the GPS receiver. For example, if the GPS signal indicates that the user is in Japan, the data management module may give weight to search results related to Japan.
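Such location weighting might look like the following sketch, where results tagged with the user's current region are boosted before ranking. The boost factor and the result record shape are arbitrary assumptions, not specified by the disclosure.

```python
# Illustrative sketch of GPS-contextual result weighting: multiply the
# base relevance score by a boost when a result is tagged with the
# user's current region, then rank by the weighted score.

def rank_with_location(results, region, boost=2.0):
    """results: list of dicts with 'title', 'score', and 'regions' (set of tags)."""
    def weighted(r):
        return r["score"] * (boost if region in r["regions"] else 1.0)
    return [r["title"] for r in sorted(results, key=weighted, reverse=True)]

results = [
    {"title": "Sushi guide: New York", "score": 0.8, "regions": {"US"}},
    {"title": "Sushi guide: Tokyo", "score": 0.5, "regions": {"Japan"}},
]
ordered = rank_with_location(results, region="Japan")
# With the user in Japan, the Tokyo result (0.5 * 2.0 = 1.0) outranks
# the higher-scoring New York result (0.8).
```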
- It should be appreciated that data ascertained from a GPS receiver may also be used to present information about a particular establishment or area the user is in. In this type of embodiment, a real estate broker, for example, may approach a home for sale and receive a list of information regarding that particular home. Such information may include the address of the home, the sale price of the home, and information regarding the neighborhood of the home organized in several electronic folders.
- In another embodiment, the IRD may be used to analyze ambient sounds detected by its sensor unit. In this embodiment, the IRD may, for example, identify the song and artist of music detected by the sensor unit through a spectral analysis of the sensor data. Similarly, the IRD may also be used to detect sounds from a telephone touch dial and determine the numbers being dialed. An added feature to this embodiment may include a reverse lookup of the phone number which would display information regarding the person on the receiving end of the call.
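Detecting the numbers being dialed from touch-tone sounds is a well-understood single-frequency detection problem; one standard approach, sketched below, is the Goertzel algorithm evaluated at the eight DTMF frequencies. This is offered only as an illustration of how such an analysis could work; the disclosure does not specify any particular algorithm.

```python
import math

# Sketch of touch-tone (DTMF) digit detection via the Goertzel algorithm.
# Standard DTMF row/column frequencies and key layout:
LOW = [697, 770, 852, 941]
HIGH = [1209, 1336, 1477, 1633]
KEYS = ["123A", "456B", "789C", "*0#D"]

def goertzel_power(samples, freq, rate):
    """Signal power at a single frequency (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / rate)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_digit(samples, rate=8000):
    """Pick the strongest row and column tone and map them to a keypad digit."""
    row = max(LOW, key=lambda f: goertzel_power(samples, f, rate))
    col = max(HIGH, key=lambda f: goertzel_power(samples, f, rate))
    return KEYS[LOW.index(row)][HIGH.index(col)]

# Synthesize 50 ms of the tone pair for "5" (770 Hz + 1336 Hz) and detect it.
rate = 8000
tone = [math.sin(2 * math.pi * 770 * n / rate) +
        math.sin(2 * math.pi * 1336 * n / rate) for n in range(400)]
digit = detect_digit(tone, rate)  # -> "5"
```

Identifying a song and artist from ambient music is a far harder spectral-fingerprinting problem, but the per-frequency analysis above is the kind of primitive such a system would build on.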
- A more complete understanding of a method and apparatus for delivering content via information retrieval devices will be afforded to those skilled in the art, as well as a realization of additional advantages and objects thereof, by a consideration of the following detailed description of the preferred embodiment. Reference will be made to the appended sheets of drawings which will first be described briefly.
-
FIG. 1 is a block diagram demonstrating a preferred embodiment of the invention; -
FIG. 2 is a flow chart illustrating the steps for users to access anticipated data according to an embodiment of the invention; -
FIG. 3 is a schematic illustration of a sensor activation Web page according to an embodiment of the invention; -
FIG. 4 is a flow chart illustrating the steps for generating anticipated data according to an embodiment of the invention; -
FIG. 5 is a schematic illustration of a Web page with various electronic folders containing links to anticipated data according to an embodiment of the invention; -
FIG. 6 is a schematic illustration of a Web page with various links to anticipated data according to an embodiment of the invention; and -
FIG. 7 is a schematic illustration of a Web page displaying anticipated data according to an embodiment of the invention. - The present invention is directed towards a method and apparatus for delivering content via information retrieval devices. In particular, a data management module anticipates the type of information a user desires by combining real time data taken from a sensor unit connected to the IRD and data regarding the history of that particular user's tendencies stored within the data management module. In the detailed description that follows, it should be appreciated that like element numerals are used to describe like elements illustrated in one or more figures.
- Referring first to
FIG. 1 , a block diagram is illustrated of a wide area network employing a method and apparatus according to an embodiment of the invention. It is anticipated that the present invention operates with a plurality of computers which are coupled together on a wide area network, such as the Internet 20, or other communications network. FIG. 1 depicts such a network which includes an information retrieval device (IRD) 10, a data management module 30, and an external data source 40. The IRD 10 is further comprised of an applications processor 13 coupled to a controller 11, a display unit 12, a Web browser 14, a context memory 15, and a data memory 16. The context memory 15 is shown connected to both a sensor unit 17 and to the data memory 16. The sensor unit 17 is also shown to be connected to an analog-to-digital (A/D) converter 18 which is directly connected to the data memory 16. - In a preferred embodiment, a user determines which sensors it wants to activate using the
controller 11 of the IRD 10. These selections are then received by the applications processor 13 where they are relayed to the context memory 15. The context memory 15 includes a set of instructions that activate particular sensors comprising the sensor unit 17 of the IRD 10. The function of the context memory 15 is thus analogous to an instruction cache for the sensor unit 17. Analog sensor data is then passed from the sensor unit to the A/D converter where it is converted to digital data. This digital data is then compressed and temporarily stored in the IRD 10 data memory unit 16 until it is ready to be sent to the data management module 30. - As illustrated, the
IRD 10 communicates with the data management module 30 and external data sources 40 via the Internet 20. The data management module 30 is further comprised of a core processor 31 coupled to a client information database 35, a search string database 39, a search engine 37, and a Web server 33 connected to an HTML (Hyper-Text Markup Language) documents database 34. - It should be appreciated that a
search engine 37 typically incorporates a database engine, such as a SQL Server™ engine from Microsoft Corporation or an Oracle™ database engine, as part of its architecture. Search engines typically perform searches by operating on a string of characters, known as a “query string.” A query string is coded according to a set of rules determined by the database engine and/or a user interface between the database engine and the user. As used herein, a “query” is broader than a “query string,” denoting both the query string and the search logic represented by the query string, whereas “query string” refers only to a string of characters, symbols, or codes used to define a query. - It should be further appreciated that
Web server 33 accesses a plurality of Web pages, distributable applications, and other electronic files containing information of various types stored in the HTML documents database. As a result, Web pages may be viewed on various web-enabled computers in a given network, such as the information retrieval device 10. For example, a particular Web page or other electronic file may be viewed through a suitable application program residing on the information retrieval device 10, such as a browser 14, or by a distributable application provided to the information retrieval device 10 by the Web server 33. It should be appreciated that many different information retrieval devices, many different Web servers, and many different search servers of various types may be communicating with each other at the same time. - As is generally known in the art, a user identifies a Web page it wishes to retrieve using the
information retrieval device 10 by communicating an HTTP (Hyper-Text Transport Protocol) request from the browser application 14. The HTTP request includes the Uniform Resource Locator (URL) of the desired Web page, which may correspond to an HTML document stored in the HTML documents database. The HTTP request is then routed to the Web server 33 via the Internet 20. The Web server 33 then retrieves the HTML document identified by the URL, and communicates the HTML document across the Internet 20 to the browser application 14. The HTML document may be communicated in the form of plural message packets as defined by standard protocols, such as the Transport Control Protocol/Internet Protocol (TCP/IP). - In a preferred embodiment of the invention, the
IRD 10 provides users with information received from a data management module 30. In particular, a data management module 30 anticipates the type of information a user desires through real time data taken from the sensor unit 17 of the IRD 10. It should, however, be appreciated that users may have the option of retrieving information by compiling a query which combines this sensor data with data regarding the history of that particular user's “tendencies” stored in the client information database 35. “Tendencies” are defined here as a topical measure of a user's information interests. Various methods may be used to determine a user's tendencies, such as, for example, providing a form or Web page for the user to designate topical areas of interest, analyzing the user's demographic and purchasing information to ascertain likely areas of interest, and analyzing a record or history of the user's prior queries. It may be particularly useful to rank user tendencies in priority order. For example, if tendencies are to be determined from a record of prior queries, the tendencies could be ranked according to factors such as, for example, the frequency with which the same or similar queries have been repeated, the number of related queries in a topical area, and the length of time since the query was repeated. These and various other methods as known in the art may be used to determine a user's tendencies, and the invention is not limited by the method by which the determination is made. - In
FIG. 2 , a flow chart illustrating the steps for users to access such anticipated data according to an embodiment of the invention is shown. The procedure begins with power being applied to the IRD 10 at step 100. The user is then asked which particular sensors it would like to activate at step 105. An example of a sensor activation Web page displayed to the user according to an embodiment of the invention is shown in FIG. 3 . As illustrated, the IRD 10 display unit 12 is comprised of various fields. Such fields may comprise a plurality of sensor fields including a “Sensor One” field 200, a “Sensor Two” field 205, a “Sensor Three” field 210, as well as all other sensor fields up to sensor field n 215 (where n represents the total number of sensors available to the user). Other fields displayed to the user may include fields used to scroll through other sensors, not currently displayed, such as an “Additional Sensors” field 220 and a “Previous Sensors” field 225. If the user wishes to activate specific sensors at step 105, then these sensors are selected by the user at step 110 and activated by the IRD 10 at step 120 by selecting the “Begin” field 235 illustrated in FIG. 3 ; otherwise, the user selects the “Default Settings” field 230 causing the IRD 10 to select sensors specified by its default settings at step 115, and thus activating those default sensors corresponding to these settings at step 120. - Once the appropriate sensors are activated at
step 120, the IRD 10 begins to scan its environment according to these sensor settings at step 125. It should be appreciated that this scan procedure may include a plurality of sensors of various types. An exemplary embodiment of the invention may, therefore, include sensors such as a microphone and GPS (Global Positioning System) receiver that respectively scan the IRD 10 environment for sound and location. Within such embodiment, a user may choose to activate both the microphone and the GPS receiver in order to simultaneously ascertain data from both devices. Other embodiments of sensors may include, but are not limited to, light sensors and motion sensors. - Returning to the flow chart illustrated in
FIG. 2 , the IRD 10 then receives analog data from its active sensors at step 130 and converts this data into digital data using the A/D converter 18 at step 135. This digital data is then compressed and temporarily stored in the IRD's data memory unit 16 at step 140 until it is sent to the data management module 30 at step 145 via the Internet 20. The procedure then continues by having the IRD 10 receive compressed feedback data from the data management module 30 at step 150. This data is then decompressed at step 155 and displayed to the user at step 160. - In
FIG. 4 , a flow chart illustrating the steps for generating the feedback data received by the IRD 10 according to an embodiment of the invention is shown. This procedure begins with the data management module 30 receiving a data signal from the IRD 10 at step 300. The received data is then decompressed at step 305 in order to generate primary search strings from data taken from the IRD 10 sensor unit 17 at step 310. The primary search strings generated at step 310 may be viewed as simple query strings, found within the search string database 39 using sensor unit 17 data, which may be used to ascertain information from conventional search engines available through the Internet 20. Such primary search strings may, for example, include “restaurants in city y”, which would correspond to data received from both a sound sensor, sensing that the word “restaurant” was said, and from a GPS receiver sensor, sensing that the user is in “city y”. - Returning to the flow chart illustrated in
FIG. 4 , the procedure continues with the data management module 30 determining the identity of the client at step 315. At step 320, the data management module 30 then opens the appropriate customer file from the client information database 35 which optionally opens the client's profile of navigation tendencies. It should be appreciated that these navigation tendencies may be repeatedly calculated at a user-defined rate from within the core processor 31 using an arbitrary statistical weighting system determined either by the user or the IRD 10 manufacturer. - In order to both narrow and customize these searches, an internal search string database 39 may be used simultaneously with the
client information database 35 to generate secondary search strings that directly correspond to the tendencies of the user. In particular, the core processor 31 searches for more specific search strings from within the search string database 39 which more accurately reflect the anticipated information desired by the user according to tendency data stored in the customer information database 35. These search strings are then combined with the primary search strings found at step 310 in order to generate secondary search strings at step 325. - The procedure continues at
step 330 with an external data source 40 search being made according to the secondary search strings found at step 325. It should be appreciated that the type of external data source 40 used at step 330 may be provided by various embodiments. This external data source 40 may, for example, be provided by a conventional search engine, an external database service provider, or any other data source available via the Internet 20. Depending on the type of data the data management module 30 is attempting to extract from these data sources 40, an internal weighting algorithm is again implemented in order to determine which returned search results best match the information desired by the user as anticipated by the data management module 30. The data management module 30 then selects only those search results receiving a criterion score above some predetermined threshold at step 335, and organizes these selected search results into various electronic folders at step 340. This data is then compressed by the data management module 30 at step 345, and finally sent to the IRD 10 at step 350. - An example of a Web page including such electronic folders displayed to the user according to an embodiment of the invention is shown in
FIG. 5 . Similar to the sensor activation Web page described with respect to FIG. 3 , the IRD 10 display unit 12 is again comprised of various fields. In this case, however, the display unit 12 may be comprised of a plurality of folder fields which include a “Folder One” field 400, a “Folder Two” field 405, a “Folder Three” field 410, as well as all other folder fields up to folder field n 415 (where n represents the total number of folders displayed to the user). Other fields displayed to the user may include fields used to scroll through other folders, not currently displayed, such as an “Additional Folders” field 420 and a “Previous Folders” field 425. If none of these folders include the user's desired search results, the user may enter its own search string in the field labeled “Search Field” 435. It should be appreciated that any search string entered by the user via the “Search Field” is used by the data management module 30 at step 330 of the flow chart illustrated in FIG. 4 in order to extract data from an external data source 40 according to this particular search string. It should be further appreciated that, at any time, the user may exit the Web page illustrated in FIG. 5 by selecting the “Home” field 430 in order to modify the active sensor settings. As a result of this selection, the IRD 10 would redisplay the sensor activation Web page illustrated in FIG. 3 and thus return the user to step 105 of the flow chart illustrated in FIG. 2 . - Once a user has selected a particular folder from the Web page illustrated in
FIG. 5 , another Web page is displayed to the user listing various links related to the selected folder. An example of a Web page including such links according to an embodiment of the invention is shown in FIG. 6 . Similar to the aforementioned Web pages described in FIGS. 3 and 5 , the IRD 10 display unit 12 is again comprised of various fields. In this case, however, the display unit 12 may be comprised of a plurality of link fields which include a “Link One” field 500, a “Link Two” field 505, a “Link Three” field 510, as well as all other link fields up to link field n 515 (where n represents the total number of links displayed to the user). Other fields displayed to the user may include fields used to scroll through other links, not currently displayed, such as an “Additional Links” field 520 and a “Previous Links” field 525. If none of these links include the user's desired search results, the user may again enter its own search string in the field labeled “Search Field” 535 where, similar to the electronic folder Web page described with respect to FIG. 5 , this string is used by the data management module 30 at step 330 of the flow chart illustrated in FIG. 4 . Also similar to the electronic folder Web page described with respect to FIG. 5 , the user may exit the Web page illustrated in FIG. 6 by selecting the “Home” field 530 in order to modify the active sensor settings at any time. - Once a user has selected a particular link from the Web page illustrated in
FIG. 6 , another Web page containing various fields, including a field displaying the data corresponding to the selected link, is displayed to the user. An example of a Web page including such fields according to an embodiment of the invention is shown in FIG. 7 . As illustrated, the display unit 12 pertaining to this particular Web page may be comprised of a plurality of fields which include the aforementioned “Selected Data” field 600 containing the data corresponding to the selected link. It should be appreciated that the data displayed to the user in the “Selected Data” field 600 may be provided in various forms. In one embodiment, for example, an investor may wish to analyze the performance of a particular stock. Within this scenario, the investor may be presented with a set of links which may include links to graphs, spreadsheets, or news regarding that particular stock. - Several other fields may also be included on the Web page illustrated in
FIG. 7 . One such field may include a “Related Folders” field 605 which may be used in order to generate a Web page similar to the one illustrated in FIG. 5 based on a modified search which includes strings related to the data currently being displayed in the “Selected Data” field 600. A “Back” and “Forward” field, 610 and 615 respectively, may also be included in order to navigate through the various Web pages selected by the user. Similar to the Web pages described above with respect to FIGS. 5 and 6 , a “Home” field 620 and a “Search Field” 625 having the same functionality as previously described may be included as well. - Within the context of the aforementioned flow charts, it should be appreciated that a plurality of embodiments which include several different types of sensor settings can be envisioned. In one such embodiment, an
IRD 10 sensor unit 17 may be comprised of a “Dictation” setting enabling the user to dictate a conversation in real time. In this type of embodiment, a user may simply select the setting corresponding to “Dictation”. In other embodiments, more sophisticated “Dictation” settings may also be implemented. Such embodiments may include a textual analysis which launches certain applications whenever particular word patterns or words matching the user's tendency data are detected. For example, if the name “John Doe” is repeatedly detected, the IRD 10 may relay any available information regarding “John Doe” to the user from the data management module 30. For further example, if “John Doe” is detected, and “John Doe” comprises an entry in the user's address book database, the address book and/or other information pertaining to “John Doe” may be retrieved and displayed. It should be appreciated that such information may include user-specific information obtained directly from the data management module 30 (e.g., an address book, calendar, etc.) or information obtained from external data sources 40 (e.g., an online database, search engine, etc.). - Similarly, the
IRD 10 may retrieve data on selected terms within the context of other terms being used. Thus, for example, if the term “head end” is used repeatedly with the term “cable”, the data management module 30 may conduct a search for these two terms together instead of individually. As a result, the data management module 30 may return information describing the function of a “head end” in conjunction with a cable system. - Another such embodiment may include a hybrid recording and transcription sensor setting which may be used in conjunction with information retrieval services or as a separate feature. In this embodiment, the
IRD 10 may generate a running transcript which includes the locations of the speakers relative to the IRD 10 in order to differentiate between different speakers in a conversation. In cases where the IRD 10 is unable to convert a sound into a word, the sound is simply recorded and included in the transcript as a hyperlink. An additional feature to this embodiment may include a real time translator application which translates between languages. - It should be appreciated that the
IRD 10 sensor unit 17 may include a GPS receiver that enables the IRD 10 to retrieve information within the context of data received by the GPS receiver. For example, if the GPS signal indicates that the user is in Japan, the data management module 30 may give weight to search results related to Japan. Thus, in the prior example, the terms “head end” and “cable” might also return information about Japanese cable television operators. - It should be further appreciated that the
IRD 10 may also anticipate the user's needs by retrieving more specific GPS location data. For example, if the user is entering a video rental store, the IRD 10 may retrieve a list of the most popular rental videos for that week, together with movies being shown on the user's local cable system. Similarly, a user entering a hardware store might be presented with the hardware store's current advertised specials, together with links for reviews of those items. Furthermore, a user may ask a salesperson a question regarding “washers” from which the IRD 10 may distinguish the user's request as being one for plumbing device “washers”, as opposed to home appliance “washers”, because the user is located in a hardware store. In this respect, the data management module 30 would conduct its search accordingly. - It should also be appreciated that data ascertained from a GPS receiver may also be used to present information about a particular establishment or area the user is in. In this type of embodiment, a real estate broker, for example, may approach a home for sale and receive a list of information regarding that particular home. Such information may include the address of the home, the sale price of the home, and information regarding the neighborhood of the home organized in several electronic folders.
- In another embodiment, the
IRD 10 may be used to analyze ambient sounds detected by its sensor unit 17. In this embodiment, the IRD 10 may, for example, identify the song and artist of music detected by the sensor unit 17 through a spectral analysis of the sensor data. Similarly, the IRD 10 may also be used to detect sounds from a telephone touch dial and determine the numbers being dialed. An added feature to this embodiment may include a reverse lookup of the phone number which would display information regarding the person on the receiving end of the call. - Having thus described several embodiments of a method and apparatus for delivering content via information retrieval devices, it should be apparent to those skilled in the art that certain advantages of the within system have been achieved. It should also be appreciated that various modifications, adaptations, and alternative embodiments thereof may be made within the scope and spirit of the present invention. The invention is further defined by the following claims.
Claims (4)
1.-23. (canceled)
24. A method, comprising:
a computing device detecting ambient audio information via a sensor unit;
the computing device analyzing the detected audio information; and
based on the analyzing, the computing device determining information regarding an origin of the detected audio information.
25. The method of claim 24 , wherein the analyzing includes performing a spectral analysis on the detected audio information.
26. The method of claim 24 , wherein the detected audio information corresponds to a work of music, and wherein the information regarding the origin of the detected audio information includes a name and artist of the work of music.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/753,243 US20130142344A1 (en) | 2000-05-08 | 2013-01-29 | Automatic location-specific content selection for portable information retrieval devices |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US20316900P | 2000-05-08 | 2000-05-08 | |
US09/850,956 US7228327B2 (en) | 2000-05-08 | 2001-05-08 | Method and apparatus for delivering content via information retrieval devices |
US11/748,985 US20070294064A1 (en) | 2000-05-08 | 2007-05-15 | Automatic location-specific content selection for portable information retrieval devices |
US13/753,243 US20130142344A1 (en) | 2000-05-08 | 2013-01-29 | Automatic location-specific content selection for portable information retrieval devices |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/748,985 Continuation US20070294064A1 (en) | 2000-05-08 | 2007-05-15 | Automatic location-specific content selection for portable information retrieval devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130142344A1 true US20130142344A1 (en) | 2013-06-06 |
Family
ID=26898372
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/850,956 Expired - Lifetime US7228327B2 (en) | 2000-05-08 | 2001-05-08 | Method and apparatus for delivering content via information retrieval devices |
US11/748,985 Abandoned US20070294064A1 (en) | 2000-05-08 | 2007-05-15 | Automatic location-specific content selection for portable information retrieval devices |
US13/753,243 Abandoned US20130142344A1 (en) | 2000-05-08 | 2013-01-29 | Automatic location-specific content selection for portable information retrieval devices |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/850,956 Expired - Lifetime US7228327B2 (en) | 2000-05-08 | 2001-05-08 | Method and apparatus for delivering content via information retrieval devices |
US11/748,985 Abandoned US20070294064A1 (en) | 2000-05-08 | 2007-05-15 | Automatic location-specific content selection for portable information retrieval devices |
Country Status (1)
Country | Link |
---|---|
US (3) | US7228327B2 (en) |
Families Citing this family (93)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6912517B2 (en) * | 2001-11-29 | 2005-06-28 | Koninklijke Philips Electronics N.V. | Intelligent information delivery system |
US20050021666A1 (en) * | 2002-10-08 | 2005-01-27 | Dinnage David M. | System and method for interactive communication between matched users |
JP4179013B2 (en) * | 2003-03-24 | 2008-11-12 | 富士ゼロックス株式会社 | Instruction management system |
US7434170B2 (en) * | 2003-07-09 | 2008-10-07 | Microsoft Corporation | Drag and drop metadata editing |
US7392477B2 (en) * | 2003-07-18 | 2008-06-24 | Microsoft Corporation | Resolving metadata matched to media content |
US7293227B2 (en) * | 2003-07-18 | 2007-11-06 | Microsoft Corporation | Associating image files with media content |
US20050015389A1 (en) * | 2003-07-18 | 2005-01-20 | Microsoft Corporation | Intelligent metadata attribute resolution |
US20050054381A1 (en) * | 2003-09-05 | 2005-03-10 | Samsung Electronics Co., Ltd. | Proactive user interface |
US7756388B2 (en) * | 2005-03-21 | 2010-07-13 | Microsoft Corporation | Media item subgroup generation from a library |
US20060218187A1 (en) * | 2005-03-25 | 2006-09-28 | Microsoft Corporation | Methods, systems, and computer-readable media for generating an ordered list of one or more media items |
US7647346B2 (en) * | 2005-03-29 | 2010-01-12 | Microsoft Corporation | Automatic rules-based device synchronization |
US20060230349A1 (en) * | 2005-04-06 | 2006-10-12 | Microsoft Corporation | Coalesced per-file device synchronization status |
US7890513B2 (en) * | 2005-06-20 | 2011-02-15 | Microsoft Corporation | Providing community-based media item ratings to users |
US7580932B2 (en) * | 2005-07-15 | 2009-08-25 | Microsoft Corporation | User interface for establishing a filtering engine |
US10911894B2 (en) | 2005-09-14 | 2021-02-02 | Verizon Media Inc. | Use of dynamic content generation parameters based on previous performance of those parameters |
US8103545B2 (en) | 2005-09-14 | 2012-01-24 | Jumptap, Inc. | Managing payment for sponsored content presented to mobile communication facilities |
US8156128B2 (en) | 2005-09-14 | 2012-04-10 | Jumptap, Inc. | Contextual mobile content placement on a mobile communication facility |
US8302030B2 (en) | 2005-09-14 | 2012-10-30 | Jumptap, Inc. | Management of multiple advertising inventories using a monetization platform |
US9201979B2 (en) | 2005-09-14 | 2015-12-01 | Millennial Media, Inc. | Syndication of a behavioral profile associated with an availability condition using a monetization platform |
US8819659B2 (en) | 2005-09-14 | 2014-08-26 | Millennial Media, Inc. | Mobile search service instant activation |
US20110106614A1 (en) * | 2005-11-01 | 2011-05-05 | Jumptap, Inc. | Mobile User Characteristics Influenced Search Results |
US7769764B2 (en) | 2005-09-14 | 2010-08-03 | Jumptap, Inc. | Mobile advertisement syndication |
US7676394B2 (en) | 2005-09-14 | 2010-03-09 | Jumptap, Inc. | Dynamic bidding and expected value |
US7660581B2 (en) | 2005-09-14 | 2010-02-09 | Jumptap, Inc. | Managing sponsored content based on usage history |
US8660891B2 (en) | 2005-11-01 | 2014-02-25 | Millennial Media | Interactive mobile advertisement banners |
US8989718B2 (en) | 2005-09-14 | 2015-03-24 | Millennial Media, Inc. | Idle screen advertising |
US8532633B2 (en) | 2005-09-14 | 2013-09-10 | Jumptap, Inc. | System for targeting advertising content to a plurality of mobile communication facilities |
US8238888B2 (en) | 2006-09-13 | 2012-08-07 | Jumptap, Inc. | Methods and systems for mobile coupon placement |
US9703892B2 (en) | 2005-09-14 | 2017-07-11 | Millennial Media Llc | Predictive text completion for a mobile communication facility |
US8364540B2 (en) | 2005-09-14 | 2013-01-29 | Jumptap, Inc. | Contextual targeting of content using a monetization platform |
US8229914B2 (en) | 2005-09-14 | 2012-07-24 | Jumptap, Inc. | Mobile content spidering and compatibility determination |
US20070061242A1 (en) * | 2005-09-14 | 2007-03-15 | Jorey Ramer | Implicit searching for mobile content |
US20110145076A1 (en) * | 2005-09-14 | 2011-06-16 | Jorey Ramer | Mobile Campaign Creation |
US8209344B2 (en) | 2005-09-14 | 2012-06-26 | Jumptap, Inc. | Embedding sponsored content in mobile applications |
US8311888B2 (en) | 2005-09-14 | 2012-11-13 | Jumptap, Inc. | Revenue models associated with syndication of a behavioral profile using a monetization platform |
US7702318B2 (en) | 2005-09-14 | 2010-04-20 | Jumptap, Inc. | Presentation of sponsored content based on mobile transaction event |
US8503995B2 (en) | 2005-09-14 | 2013-08-06 | Jumptap, Inc. | Mobile dynamic advertisement creation and placement |
US8832100B2 (en) | 2005-09-14 | 2014-09-09 | Millennial Media, Inc. | User transaction history influenced search results |
US7752209B2 (en) | 2005-09-14 | 2010-07-06 | Jumptap, Inc. | Presenting sponsored content on a mobile communication facility |
US20110313853A1 (en) | 2005-09-14 | 2011-12-22 | Jorey Ramer | System for targeting advertising content to a plurality of mobile communication facilities |
US20110153428A1 (en) * | 2005-09-14 | 2011-06-23 | Jorey Ramer | Targeted advertising to specified mobile communication facilities |
US7577665B2 (en) * | 2005-09-14 | 2009-08-18 | Jumptap, Inc. | User characteristic influenced search results |
US8195133B2 (en) | 2005-09-14 | 2012-06-05 | Jumptap, Inc. | Mobile dynamic advertisement creation and placement |
US9471925B2 (en) | 2005-09-14 | 2016-10-18 | Millennial Media Llc | Increasing mobile interactivity |
US9076175B2 (en) | 2005-09-14 | 2015-07-07 | Millennial Media, Inc. | Mobile comparison shopping |
US8666376B2 (en) | 2005-09-14 | 2014-03-04 | Millennial Media | Location based mobile shopping affinity program |
US9058406B2 (en) | 2005-09-14 | 2015-06-16 | Millennial Media, Inc. | Management of multiple advertising inventories using a monetization platform |
US8805339B2 (en) | 2005-09-14 | 2014-08-12 | Millennial Media, Inc. | Categorization of a mobile user profile based on browse and viewing behavior |
US7912458B2 (en) | 2005-09-14 | 2011-03-22 | Jumptap, Inc. | Interaction analysis and prioritization of mobile content |
US8812526B2 (en) | 2005-09-14 | 2014-08-19 | Millennial Media, Inc. | Mobile content cross-inventory yield optimization |
US8131271B2 (en) | 2005-11-05 | 2012-03-06 | Jumptap, Inc. | Categorization of a mobile user profile based on browse behavior |
US10038756B2 (en) | 2005-09-14 | 2018-07-31 | Millenial Media LLC | Managing sponsored content based on device characteristics |
US8615719B2 (en) | 2005-09-14 | 2013-12-24 | Jumptap, Inc. | Managing sponsored content for delivery to mobile communication facilities |
US8364521B2 (en) | 2005-09-14 | 2013-01-29 | Jumptap, Inc. | Rendering targeted advertisement on mobile communication facilities |
US8688671B2 (en) | 2005-09-14 | 2014-04-01 | Millennial Media | Managing sponsored content based on geographic region |
US10592930B2 (en) | 2005-09-14 | 2020-03-17 | Millenial Media, LLC | Syndication of a behavioral profile using a monetization platform |
US8175585B2 (en) | 2005-11-05 | 2012-05-08 | Jumptap, Inc. | System for targeting advertising content to a plurality of mobile communication facilities |
US7685210B2 (en) * | 2005-12-30 | 2010-03-23 | Microsoft Corporation | Media discovery and curation of playlists |
JP4728860B2 (en) * | 2006-03-29 | 2011-07-20 | 株式会社東芝 | Information retrieval device |
US20070244856A1 (en) * | 2006-04-14 | 2007-10-18 | Microsoft Corporation | Media Search Scope Expansion |
US20080109489A1 (en) * | 2006-11-03 | 2008-05-08 | Adrian Sherwood | Method For Generating Reports |
KR100888364B1 (en) * | 2006-11-08 | 2009-03-11 | 한국전자통신연구원 | Apparatus for processing of integrated data of various sensor networks and its method |
US11265355B2 (en) | 2007-08-24 | 2022-03-01 | Iheartmedia Management Services, Inc. | Customized perishable media content based on user-specified preference for static or variable location |
US9990655B2 (en) | 2007-08-24 | 2018-06-05 | Iheartmedia Management Services, Inc. | Live media stream including personalized notifications |
US9699232B2 (en) | 2007-08-24 | 2017-07-04 | Iheartmedia Management Services, Inc. | Adding perishable content to media stream based on user location preference |
US8126867B2 (en) * | 2007-10-24 | 2012-02-28 | The Invention Science Fund I, Llc | Returning a second content based on a user's reaction to a first content |
US20090112849A1 (en) * | 2007-10-24 | 2009-04-30 | Searete Llc | Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content |
US9513699B2 (en) * | 2007-10-24 | 2016-12-06 | Invention Science Fund I, LLC | Method of selecting a second content based on a user's reaction to a first content |
US20090113297A1 (en) * | 2007-10-24 | 2009-04-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Requesting a second content based on a user's reaction to a first content |
US8001108B2 (en) * | 2007-10-24 | 2011-08-16 | The Invention Science Fund I, Llc | Returning a new content based on a person's reaction to at least two instances of previously displayed content |
US20090112693A1 (en) * | 2007-10-24 | 2009-04-30 | Jung Edward K Y | Providing personalized advertising |
US9582805B2 (en) * | 2007-10-24 | 2017-02-28 | Invention Science Fund I, Llc | Returning a personalized advertisement |
US8112407B2 (en) * | 2007-10-24 | 2012-02-07 | The Invention Science Fund I, Llc | Selecting a second content based on a user's reaction to a first content |
US8234262B2 (en) * | 2007-10-24 | 2012-07-31 | The Invention Science Fund I, Llc | Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content |
US20090112696A1 (en) * | 2007-10-24 | 2009-04-30 | Jung Edward K Y | Method of space-available advertising in a mobile device |
US20090112694A1 (en) * | 2007-10-24 | 2009-04-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Targeted-advertising based on a sensed physiological response by a person to a general advertisement |
US20090112697A1 (en) * | 2007-10-30 | 2009-04-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Providing personalized advertising |
US20090181352A1 (en) * | 2008-01-15 | 2009-07-16 | Pauline Hood | Multiple student behavior counter |
US8156046B2 (en) * | 2008-08-30 | 2012-04-10 | Yang Pan | Methods of rendering recommended media assets to a user by employing a handheld media player |
SG176724A1 (en) * | 2009-06-25 | 2012-01-30 | Astrazeneca Ab | Method for treating a patient at risk for developing an nsaid-associated ulcer |
US8121618B2 (en) | 2009-10-28 | 2012-02-21 | Digimarc Corporation | Intuitive computing methods and systems |
CN102667840A (en) * | 2009-12-15 | 2012-09-12 | 英特尔公司 | Context information utilizing systems, apparatus and methods |
US9484046B2 (en) | 2010-11-04 | 2016-11-01 | Digimarc Corporation | Smartphone-based methods and systems |
US20120246044A1 (en) * | 2011-03-25 | 2012-09-27 | Bank Of America | Account and Investment Market Monitoring Tools |
EP3925676A1 (en) | 2011-08-18 | 2021-12-22 | Pfaqutruma Research LLC | Systems and methods of virtual world interaction |
US9324236B2 (en) * | 2011-11-23 | 2016-04-26 | The Boeing Company | System and methods for situation awareness, advisory, tracking, and aircraft control information |
EP2613495A1 (en) * | 2012-01-09 | 2013-07-10 | OÜ Eliko Tehnoloogia Arenduskeskus | Method for determining digital content preferences of the user |
US9460237B2 (en) | 2012-05-08 | 2016-10-04 | 24/7 Customer, Inc. | Predictive 411 |
WO2013192113A1 (en) | 2012-06-18 | 2013-12-27 | Brian Mark Shuster | Transfer of virtual objects between applications |
US9495456B2 (en) | 2012-06-25 | 2016-11-15 | Google Inc. | Selecting, ranking, and/or presenting microsite content |
US9354778B2 (en) | 2013-12-06 | 2016-05-31 | Digimarc Corporation | Smartphone-based methods and systems |
US9311639B2 (en) | 2014-02-11 | 2016-04-12 | Digimarc Corporation | Methods, apparatus and arrangements for device to device communication |
EP3236363A1 (en) * | 2016-04-18 | 2017-10-25 | Nokia Technologies Oy | Content search |
Citations (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6570991B1 (en) * | 1996-12-18 | 2003-05-27 | Interval Research Corporation | Multi-feature speech/music discrimination system |
US7562392B1 (en) * | 1999-05-19 | 2009-07-14 | Digimarc Corporation | Methods of interacting with audio and ambient music |
US20100009722A1 (en) * | 1995-07-27 | 2010-01-14 | Levy Kenneth L | Connected Audio and Other Media Objects |
US7693965B2 (en) * | 1993-11-18 | 2010-04-06 | Digimarc Corporation | Analyzing audio, including analyzing streaming audio signals |
US7724919B2 (en) * | 1994-10-21 | 2010-05-25 | Digimarc Corporation | Methods and systems for steganographic processing |
US20100150395A1 (en) * | 1999-05-19 | 2010-06-17 | Davis Bruce L | Data Transmission by Extracted or Calculated Identifying Data |
US7751596B2 (en) * | 1996-11-12 | 2010-07-06 | Digimarc Corporation | Methods and arrangements employing digital content items |
US20100172540A1 (en) * | 2000-02-04 | 2010-07-08 | Davis Bruce L | Synchronizing Rendering of Multimedia Content |
US20100185306A1 (en) * | 1999-05-19 | 2010-07-22 | Rhoads Geoffrey B | Methods and Systems Employing Digital Content |
US7770013B2 (en) * | 1995-07-27 | 2010-08-03 | Digimarc Corporation | Digital authentication with digital and analog documents |
US7805500B2 (en) * | 1995-05-08 | 2010-09-28 | Digimarc Corporation | Network linking methods and apparatus |
US7920713B2 (en) * | 2004-12-20 | 2011-04-05 | Lsi Corporation | Recorded video broadcast, streaming, download, and disk distribution with watermarking instructions |
US7930546B2 (en) * | 1996-05-16 | 2011-04-19 | Digimarc Corporation | Methods, systems, and sub-combinations useful in media identification |
US7936900B2 (en) * | 1995-05-08 | 2011-05-03 | Digimarc Corporation | Processing data representing video and audio and methods related thereto |
US7953390B2 (en) * | 2000-03-28 | 2011-05-31 | Affinity Labs Of Texas, Llc | Method for content delivery |
US7961949B2 (en) * | 1995-05-08 | 2011-06-14 | Digimarc Corporation | Extracting multiple identifiers from audio and video content |
US7974439B2 (en) * | 1993-11-18 | 2011-07-05 | Digimarc Corporation | Embedding hidden auxiliary information in media |
US7987492B2 (en) * | 2000-03-09 | 2011-07-26 | Gad Liwerant | Sharing a streaming video |
US7987245B2 (en) * | 1995-07-27 | 2011-07-26 | Digimarc Corporation | Internet linking from audio |
US8023773B2 (en) * | 2000-12-21 | 2011-09-20 | Digimarc Corporation | Methods, apparatus and programs for generating and utilizing content signatures |
US8036418B2 (en) * | 2000-01-26 | 2011-10-11 | Digimarc Corporation | Systems and methods of managing audio and other media |
US8036420B2 (en) * | 1999-12-28 | 2011-10-11 | Digimarc Corporation | Substituting or replacing components in sound based on steganographic encoding |
US8041074B2 (en) * | 1998-04-16 | 2011-10-18 | Digimarc Corporation | Content indexing and searching using content identifiers and associated metadata |
US8041734B2 (en) * | 2005-11-10 | 2011-10-18 | Soundhound, Inc. | System and method for storing and retrieving non-text-based information |
US8051169B2 (en) * | 2000-03-18 | 2011-11-01 | Digimarc Corporation | Methods and systems useful in linking from objects to remote resources |
US8055588B2 (en) * | 1999-05-19 | 2011-11-08 | Digimarc Corporation | Digital media methods |
US8085976B2 (en) * | 2001-03-05 | 2011-12-27 | Digimarc Corporation | Digital watermarking video captured from airborne platforms |
US8095796B2 (en) * | 1999-05-19 | 2012-01-10 | Digimarc Corporation | Content identifiers |
US8094949B1 (en) * | 1994-10-21 | 2012-01-10 | Digimarc Corporation | Music methods and systems |
US8099403B2 (en) * | 2000-07-20 | 2012-01-17 | Digimarc Corporation | Content identification and management in content distribution networks |
US8108484B2 (en) * | 1999-05-19 | 2012-01-31 | Digimarc Corporation | Fingerprints and machine-readable codes combined with user characteristics to obtain content or information |
US8121342B2 (en) * | 2000-01-13 | 2012-02-21 | Digimarc Corporation | Associating metadata with media signals, and searching for media signals using metadata |
US8121843B2 (en) * | 2000-05-02 | 2012-02-21 | Digimarc Corporation | Fingerprint methods and systems for media signals |
US8126201B2 (en) * | 2000-09-11 | 2012-02-28 | Digimarc Corporation | Watermark decoding from streaming media |
US8131760B2 (en) * | 2000-07-20 | 2012-03-06 | Digimarc Corporation | Using object identifiers with content distribution |
US8165341B2 (en) * | 1998-04-16 | 2012-04-24 | Digimarc Corporation | Methods and apparatus to process imagery or audio content |
US8312168B2 (en) * | 2000-03-18 | 2012-11-13 | Digimarc Corporation | Methods for linking from objects to remote resources |
US8379908B2 (en) * | 1995-07-27 | 2013-02-19 | Digimarc Corporation | Embedding and reading codes on objects |
US8429205B2 (en) * | 1995-07-27 | 2013-04-23 | Digimarc Corporation | Associating data with media signals in media signal systems through auxiliary data steganographically embedded in the media signals |
US8462950B2 (en) * | 2000-04-27 | 2013-06-11 | Qualcomm Incorporated | System and method for extracting, decoding, and utilizing hidden data embedded in audio signals |
US8489598B2 (en) * | 1999-05-19 | 2013-07-16 | Digimarc Corporation | Methods and devices employing content identifiers |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6240365B1 (en) * | 1997-01-21 | 2001-05-29 | Frank E. Bunn | Automated vehicle tracking and service provision system |
DE19730363B4 (en) * | 1997-07-15 | 2011-08-11 | Telefonaktiebolaget Lm Ericsson (Publ) | Site-specific World Wide Web services in digital cellular communication networks |
US6236768B1 (en) * | 1997-10-14 | 2001-05-22 | Massachusetts Institute Of Technology | Method and apparatus for automated, context-dependent retrieval of information |
US6647257B2 (en) * | 1998-01-21 | 2003-11-11 | Leap Wireless International, Inc. | System and method for providing targeted messages based on wireless mobile location |
US6826598B1 (en) * | 1998-05-05 | 2004-11-30 | British Telecommunications Public Limited Company | Storage and retrieval of location based information in a distributed network of data storage devices |
US6199094B1 (en) * | 1998-06-05 | 2001-03-06 | International Business Machines Corp. | Protecting shared resources using mutex striping |
US6266668B1 (en) * | 1998-08-04 | 2001-07-24 | Dryken Technologies, Inc. | System and method for dynamic data-mining and on-line communication of customized information |
US6580914B1 (en) * | 1998-08-17 | 2003-06-17 | At&T Wireless Services, Inc. | Method and apparatus for automatically providing location-based information content on a wireless device |
US6434524B1 (en) * | 1998-09-09 | 2002-08-13 | One Voice Technologies, Inc. | Object interactive user interface using speech recognition and natural language processing |
WO2000022549A1 (en) * | 1998-10-09 | 2000-04-20 | Koninklijke Philips Electronics N.V. | Automatic inquiry method and system |
US6654891B1 (en) * | 1998-10-29 | 2003-11-25 | Nortel Networks Limited | Trusted network binding using LDAP (lightweight directory access protocol) |
US20030060211A1 (en) * | 1999-01-26 | 2003-03-27 | Vincent Chern | Location-based information retrieval system for wireless communication device |
US6847969B1 (en) * | 1999-05-03 | 2005-01-25 | Streetspace, Inc. | Method and system for providing personalized online services and advertisements in public spaces |
US6601026B2 (en) * | 1999-09-17 | 2003-07-29 | Discern Communications, Inc. | Information retrieval by natural language querying |
US20020120629A1 (en) * | 1999-10-29 | 2002-08-29 | Leonard Robert E. | Method and apparatus for information delivery on computer networks |
GB2360421B (en) * | 1999-11-10 | 2004-02-18 | Ibm | Transmission of geographic information to mobile devices |
US7050977B1 (en) * | 1999-11-12 | 2006-05-23 | Phoenix Solutions, Inc. | Speech-enabled server for internet website and method |
US6650902B1 (en) * | 1999-11-15 | 2003-11-18 | Lucent Technologies Inc. | Method and apparatus for wireless telecommunications system that provides location-based information delivery to a wireless mobile unit |
US7299405B1 (en) * | 2000-03-08 | 2007-11-20 | Ricoh Company, Ltd. | Method and system for information management to facilitate the exchange of ideas during a collaborative effort |
US6397206B1 (en) * | 1999-12-15 | 2002-05-28 | International Business Machines Corporation | Optimizing fixed, static query or service selection and execution based on working set hints and query signatures |
US6665658B1 (en) * | 2000-01-13 | 2003-12-16 | International Business Machines Corporation | System and method for automatically gathering dynamic content and resources on the world wide web by stimulating user interaction and managing session information |
US6405034B1 (en) * | 2000-01-28 | 2002-06-11 | Leap Wireless International, Inc. | Adaptive communication data retrieval system |
US20010034660A1 (en) * | 2000-02-09 | 2001-10-25 | Heinz Heumann | Goods and services referring by location |
FI112433B (en) * | 2000-02-29 | 2003-11-28 | Nokia Corp | Location-related services |
US6658389B1 (en) * | 2000-03-24 | 2003-12-02 | Ahmet Alpdemir | System, method, and business model for speech-interactive information system having business self-promotion, audio coupon and rating features |
US6564210B1 (en) * | 2000-03-27 | 2003-05-13 | Virtual Self Ltd. | System and method for searching databases employing user profiles |
US7213048B1 (en) * | 2000-04-05 | 2007-05-01 | Microsoft Corporation | Context aware computing devices and methods |
WO2001082031A2 (en) * | 2000-04-26 | 2001-11-01 | Portable Internet Inc. | Portable internet services |
US20030220917A1 (en) * | 2002-04-03 | 2003-11-27 | Max Copperman | Contextual search |
US7003972B2 (en) * | 2003-11-24 | 2006-02-28 | Lg Electronics Inc. | Indoor unit for air conditioner |
- 2001
  - 2001-05-08: US US09/850,956 patent/US7228327B2/en not_active Expired - Lifetime
- 2007
  - 2007-05-15: US US11/748,985 patent/US20070294064A1/en not_active Abandoned
- 2013
  - 2013-01-29: US US13/753,243 patent/US20130142344A1/en not_active Abandoned
Patent Citations (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8073933B2 (en) * | 1993-11-18 | 2011-12-06 | Digimarc Corporation | Audio processing |
US8010632B2 (en) * | 1993-11-18 | 2011-08-30 | Digimarc Corporation | Steganographic encoding for video and images |
US7693965B2 (en) * | 1993-11-18 | 2010-04-06 | Digimarc Corporation | Analyzing audio, including analyzing streaming audio signals |
US7974439B2 (en) * | 1993-11-18 | 2011-07-05 | Digimarc Corporation | Embedding hidden auxiliary information in media |
US8094949B1 (en) * | 1994-10-21 | 2012-01-10 | Digimarc Corporation | Music methods and systems |
US8014563B2 (en) * | 1994-10-21 | 2011-09-06 | Digimarc Corporation | Methods and systems for steganographic processing |
US7724919B2 (en) * | 1994-10-21 | 2010-05-25 | Digimarc Corporation | Methods and systems for steganographic processing |
US7805500B2 (en) * | 1995-05-08 | 2010-09-28 | Digimarc Corporation | Network linking methods and apparatus |
US8068679B2 (en) * | 1995-05-08 | 2011-11-29 | Digimarc Corporation | Audio and video signal processing |
US7961949B2 (en) * | 1995-05-08 | 2011-06-14 | Digimarc Corporation | Extracting multiple identifiers from audio and video content |
US7936900B2 (en) * | 1995-05-08 | 2011-05-03 | Digimarc Corporation | Processing data representing video and audio and methods related thereto |
US7987245B2 (en) * | 1995-07-27 | 2011-07-26 | Digimarc Corporation | Internet linking from audio |
US20100009722A1 (en) * | 1995-07-27 | 2010-01-14 | Levy Kenneth L | Connected Audio and Other Media Objects |
US7770013B2 (en) * | 1995-07-27 | 2010-08-03 | Digimarc Corporation | Digital authentication with digital and analog documents |
US8315554B2 (en) * | 1995-07-27 | 2012-11-20 | Digimarc Corporation | Connected audio content |
US20100257069A1 (en) * | 1995-07-27 | 2010-10-07 | Levy Kenneth L | Connected Audio Content |
US8190713B2 (en) * | 1995-07-27 | 2012-05-29 | Digimarc Corporation | Controlling a device based upon steganographically encoded data |
US8521850B2 (en) * | 1995-07-27 | 2013-08-27 | Digimarc Corporation | Content containing a steganographically encoded process identifier |
US8429205B2 (en) * | 1995-07-27 | 2013-04-23 | Digimarc Corporation | Associating data with media signals in media signal systems through auxiliary data steganographically embedded in the media signals |
US7650010B2 (en) * | 1995-07-27 | 2010-01-19 | Digimarc Corporation | Connected video and audio |
US8224022B2 (en) * | 1995-07-27 | 2012-07-17 | Digimarc Corporation | Connected audio and other media objects |
US8379908B2 (en) * | 1995-07-27 | 2013-02-19 | Digimarc Corporation | Embedding and reading codes on objects |
US7930546B2 (en) * | 1996-05-16 | 2011-04-19 | Digimarc Corporation | Methods, systems, and sub-combinations useful in media identification |
US7751596B2 (en) * | 1996-11-12 | 2010-07-06 | Digimarc Corporation | Methods and arrangements employing digital content items |
US6570991B1 (en) * | 1996-12-18 | 2003-05-27 | Interval Research Corporation | Multi-feature speech/music discrimination system |
US8165341B2 (en) * | 1998-04-16 | 2012-04-24 | Digimarc Corporation | Methods and apparatus to process imagery or audio content |
US8041074B2 (en) * | 1998-04-16 | 2011-10-18 | Digimarc Corporation | Content indexing and searching using content identifiers and associated metadata |
US7562392B1 (en) * | 1999-05-19 | 2009-07-14 | Digimarc Corporation | Methods of interacting with audio and ambient music |
US8489598B2 (en) * | 1999-05-19 | 2013-07-16 | Digimarc Corporation | Methods and devices employing content identifiers |
US20100036881A1 (en) * | 1999-05-19 | 2010-02-11 | Rhoads Geoffrey B | Portable Audio Appliance |
US8155582B2 (en) * | 1999-05-19 | 2012-04-10 | Digimarc Corporation | Methods and systems employing digital content |
US8151113B2 (en) * | 1999-05-19 | 2012-04-03 | Digimarc Corporation | Methods and devices responsive to ambient audio |
US8160968B2 (en) * | 1999-05-19 | 2012-04-17 | Digimarc Corporation | Digital media methods |
US8126200B2 (en) * | 1999-05-19 | 2012-02-28 | Digimarc Corporation | Methods and systems employing digital content |
US20100138012A1 (en) * | 1999-05-19 | 2010-06-03 | Rhoads Geoffrey B | Methods and Devices Responsive to Ambient Audio |
US7965864B2 (en) * | 1999-05-19 | 2011-06-21 | Digimarc Corporation | Data transmission by extracted or calculated identifying data |
US8055588B2 (en) * | 1999-05-19 | 2011-11-08 | Digimarc Corporation | Digital media methods |
US7966494B2 (en) * | 1999-05-19 | 2011-06-21 | Digimarc Corporation | Visual content-based internet search methods and sub-combinations |
US20100150395A1 (en) * | 1999-05-19 | 2010-06-17 | Davis Bruce L | Data Transmission by Extracted or Calculated Identifying Data |
US8255693B2 (en) * | 1999-05-19 | 2012-08-28 | Digimarc Corporation | Methods and devices responsive to ambient audio |
US20100046744A1 (en) * | 1999-05-19 | 2010-02-25 | Rhoads Geoffrey B | Methods and Devices Responsive to Ambient Audio |
US8095796B2 (en) * | 1999-05-19 | 2012-01-10 | Digimarc Corporation | Content identifiers |
US20100185306A1 (en) * | 1999-05-19 | 2010-07-22 | Rhoads Geoffrey B | Methods and Systems Employing Digital Content |
US8200976B2 (en) * | 1999-05-19 | 2012-06-12 | Digimarc Corporation | Portable audio appliance |
US8108484B2 (en) * | 1999-05-19 | 2012-01-31 | Digimarc Corporation | Fingerprints and machine-readable codes combined with user characteristics to obtain content or information |
US8122257B2 (en) * | 1999-05-19 | 2012-02-21 | Digimarc Corporation | Audio-based, location-related methods |
US20100322035A1 (en) * | 1999-05-19 | 2010-12-23 | Rhoads Geoffrey B | Audio-Based, Location-Related Methods |
US8036420B2 (en) * | 1999-12-28 | 2011-10-11 | Digimarc Corporation | Substituting or replacing components in sound based on steganographic encoding |
US8121342B2 (en) * | 2000-01-13 | 2012-02-21 | Digimarc Corporation | Associating metadata with media signals, and searching for media signals using metadata |
US8036418B2 (en) * | 2000-01-26 | 2011-10-11 | Digimarc Corporation | Systems and methods of managing audio and other media |
US8107674B2 (en) * | 2000-02-04 | 2012-01-31 | Digimarc Corporation | Synchronizing rendering of multimedia content |
US20100172540A1 (en) * | 2000-02-04 | 2010-07-08 | Davis Bruce L | Synchronizing Rendering of Multimedia Content |
US7987492B2 (en) * | 2000-03-09 | 2011-07-26 | Gad Liwerant | Sharing a streaming video |
US8051169B2 (en) * | 2000-03-18 | 2011-11-01 | Digimarc Corporation | Methods and systems useful in linking from objects to remote resources |
US8312168B2 (en) * | 2000-03-18 | 2012-11-13 | Digimarc Corporation | Methods for linking from objects to remote resources |
US7953390B2 (en) * | 2000-03-28 | 2011-05-31 | Affinity Labs Of Texas, Llc | Method for content delivery |
US8532641B2 (en) * | 2000-03-28 | 2013-09-10 | Affinity Labs Of Texas, Llc | System and method for managing media |
US8359007B2 (en) * | 2000-03-28 | 2013-01-22 | Affinity Labs Of Texas, Llc | System and method for communicating media center |
US8521140B2 (en) * | 2000-03-28 | 2013-08-27 | Affinity Labs Of Texas, Llc | System and method for communicating media content |
US7970379B2 (en) * | 2000-03-28 | 2011-06-28 | Affinity Labs Of Texas, Llc | Providing broadcast content |
US8462950B2 (en) * | 2000-04-27 | 2013-06-11 | Qualcomm Incorporated | System and method for extracting, decoding, and utilizing hidden data embedded in audio signals |
US8121843B2 (en) * | 2000-05-02 | 2012-02-21 | Digimarc Corporation | Fingerprint methods and systems for media signals |
US8099403B2 (en) * | 2000-07-20 | 2012-01-17 | Digimarc Corporation | Content identification and management in content distribution networks |
US8131760B2 (en) * | 2000-07-20 | 2012-03-06 | Digimarc Corporation | Using object identifiers with content distribution |
US8126201B2 (en) * | 2000-09-11 | 2012-02-28 | Digimarc Corporation | Watermark decoding from streaming media |
US8023773B2 (en) * | 2000-12-21 | 2011-09-20 | Digimarc Corporation | Methods, apparatus and programs for generating and utilizing content signatures |
US8077911B2 (en) * | 2000-12-21 | 2011-12-13 | Digimarc Corporation | Methods, apparatus and programs for generating and utilizing content signatures |
US8488836B2 (en) * | 2000-12-21 | 2013-07-16 | Digimarc Corporation | Methods, apparatus and programs for generating and utilizing content signatures |
US8085976B2 (en) * | 2001-03-05 | 2011-12-27 | Digimarc Corporation | Digital watermarking video captured from airborne platforms |
US7920713B2 (en) * | 2004-12-20 | 2011-04-05 | Lsi Corporation | Recorded video broadcast, streaming, download, and disk distribution with watermarking instructions |
US8041734B2 (en) * | 2005-11-10 | 2011-10-18 | Soundhound, Inc. | System and method for storing and retrieving non-text-based information |
Also Published As
Publication number | Publication date |
---|---|
US20020059370A1 (en) | 2002-05-16 |
US20070294064A1 (en) | 2007-12-20 |
US7228327B2 (en) | 2007-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7228327B2 (en) | Method and apparatus for delivering content via information retrieval devices | |
US11100175B2 (en) | Method of and system for conducting personalized federated search and presentation of results therefrom | |
KR100478019B1 (en) | Method and system for generating a search result list based on local information | |
US8990182B2 (en) | Methods and apparatus for searching the Internet | |
US8666963B2 (en) | Method and apparatus for processing spoken search queries | |
US8798583B2 (en) | Tag ticker display on a mobile device | |
US20070185843A1 (en) | Automated tool for human assisted mining and capturing of precise results | |
US20140207749A1 (en) | Method and System for Incrementally Selecting and Providing Relevant Search Engines in Response to a User Query | |
US20090006962A1 (en) | Audio thumbnail | |
US20120060113A1 (en) | Methods and apparatus for displaying content | |
US20120059658A1 (en) | Methods and apparatus for performing an internet search | |
KR20060006913A (en) | A system for generating search results including searching by subdomain hints and providing sponsored results by subdomain | |
JP2009140477A (en) | Device and method for service proposition, system for service proposition, and device and method for service proposition based on user's favorite base | |
MX2008009454A (en) | Targeted mobile device advertisements. | |
US7853606B1 (en) | Alternate methods of displaying search results | |
JP2009070157A (en) | Information retrieval system and information retrieval method | |
KR100909561B1 (en) | System for generating a search result list based on local information | |
WO2009001139A1 (en) | Audio thumbnail | |
EP2732389A2 (en) | Methods and apparatus for identifying and providing information sought by a user |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTELLECTUAL VENTURES I LLC, DELAWARE Free format text: MERGER;ASSIGNOR:HOSHIKO LLC;REEL/FRAME:030639/0289 Effective date: 20130523 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |