WO2016080969A1 - User interface overlay - Google Patents

User interface overlay

Info

Publication number
WO2016080969A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sequence
action
action trigger
user actions
Application number
PCT/US2014/066175
Other languages
French (fr)
Inventor
Eitan KATZ
Tomer PRIEL
Adi Kidron
Original Assignee
Hewlett Packard Enterprise Development Lp
Application filed by Hewlett Packard Enterprise Development Lp filed Critical Hewlett Packard Enterprise Development Lp
Priority to PCT/US2014/066175
Publication of WO2016080969A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/453Help systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

An interface overlay method includes processing first monitoring data to identify user actions occurring on a plurality of computing devices with respect to a particular application. From the processing, a sequence of the user actions is identified, where that sequence is indicative of a performance issue and is repeated with respect to two or more of the plurality of devices. A user action trigger for the identified sequence is then identified and associated with an interface overlay action intended to address the indicated performance issue.

Description

USER INTERFACE OVERLAY
BACKGROUND
[0001] Application users interact with controls presented in a graphical user interface. The interaction can take a variety of forms including touch, gesture, and the use of devices such as mice and keyboards. Successful adoption of an application can depend heavily on the application's actual and perceived performance. Perceived performance is dictated by user experience. User experience is heavily impacted by the application's responsiveness.
DRAWINGS
[0002] Fig. 1 depicts an example user interface.
[0003] Figs. 2 and 3 depict examples in which an interface overlay action has been instigated with respect to the user interface of Fig. 1.
[0004] Fig. 4 depicts an example environment in which embodiments may be implemented.
[0005] Fig. 5 is a block diagram depicting an example of an interface overlay system.
[0006] Fig. 6 is a block diagram depicting a memory resource and a processing resource according to an example.
[0007] Figs. 7 and 8 are flow diagrams depicting actions taken to implement examples.
DETAILED DESCRIPTION
[0008] INTRODUCTION: Over its lifecycle, an application may be continually updated to add and refine features and functionality. When doing so, developers may introduce unexpected performance issues. As an example, a new feature may cause the application to appear non-responsive, while the addition of multiple features may leave users hunting for or unaware of new functionality. In the first instance, users may consume a new feature by interacting with a designated interface control that instructs the application to take a specified action. That action may be performed by a backend process and take a discernable amount of time. In cases where the user interface is not updated to indicate that the backend is processing the user request, the user can get frustrated and repeatedly interact with that control to no apparent avail. In the second example, a user may traipse through the user interface to find a new function; in doing so, the user continually selects menu items until the user finds the function or stops looking.
[0009] In both examples above, the user's perception of application performance can suffer. Updating the application to address such a performance issue takes time, leaving the application at risk of losing users. Embodiments described below have been developed to serve as an application performance patch that can be applied on an individual user basis. User actions with the application are monitored. These include the users' interactions with the application user interface across a plurality of computing devices. Certain sequences of user actions, such as repeatedly selecting a control, are indicative of a potential performance issue. These sequences are identified as user action triggers and associated with an interface overlay action. When, through an analysis of the monitoring data, a user action trigger is detected for an individual computing device, the trigger's associated interface overlay action is instigated.
[0010] Figs. 1-3 serve as a contextual example. Fig. 1 depicts a screen view of user interface 10. Interface 10 includes a series of menu controls 12-18. In this example, menu control 16 has been selected, resulting in the display of functional controls 20-28. A user is able to add data to a number of text boxes 20-24 and then commit a selected action using control buttons 26 and 28. Upon selecting control 28, a backend process takes information supplied in text boxes 20-24 to perform a function such as building and executing a query. While the backend process is executing, user interface 10 may be updated to show a "busy" or "working" indicator. Where a developer has not yet included such an indicator, a user is often tempted to repeatedly select control 28 to make sure the application registered the selection. Embodiments described below operate to detect potentially negative usage patterns such as this, assign a user action trigger, and associate that trigger with an interface overlay action.
[0011] Moving to Fig. 2, a user action trigger that includes a specified number of consecutive selections (or rate of selection) of control 28 has been identified based on a pattern of usage of interface 10. That user action trigger is then associated with a particular user interface overlay action. Following detection of the user action trigger on a computing device, the associated overlay action is instigated causing the display of interface overlay 30. In the example of Fig. 2, overlay 30 provides an indication to that user that the application registered the selection of control 28 and that a backend process is running. This user action trigger may remain associated with the interface overlay action until the application is updated to include its own appropriate indicator.
[0012] In Fig. 2, the usage pattern at issue included the repeated selection of a single interface control, indicating the user is not aware whether the application has registered the selection. Other usage patterns can also indicate potential user frustration or confusion. Looking at Fig. 3, a user may find it difficult to identify a desired feature that may be accessed using a particular menu. Searching for the feature, the user may repeatedly select different menu controls 12-18. Here the usage pattern may include a consecutive number of selections of a type of interface control, or a rapid rate of such selections, indicating the user may be having difficulty finding a feature. In this example, the usage pattern and its corresponding user action trigger may be the same and associated with an interface overlay action for offering assistance. Here, instigation of the interface overlay action results in the display of overlay 32.
[0013] COMPONENTS: Fig. 4 depicts an environment 34 in which various examples may be implemented as interface overlay system 36. In addition to system 36, environment 34 is shown to include server device 38 and user computing devices 40-44, depicted as a smart phone 40, a notebook computer 42, and a tablet computer 44. Computing devices 40-44 are intended to represent devices capable of presenting a user interface for interacting with an application being monitored.
[0014] Server device 38 represents computing devices capable of implementing interface overlay system 36. Server device 38 may also be responsible for serving the monitored application to user computing devices 40-44. Server device 38 is capable of interaction with user computing devices 40-44 via link 46. Link 46 represents generally any infrastructure or combination of infrastructures, wired and wireless, configured to enable electronic data communication between components 38-44. For example, link 46 may represent the internet, one or more intranets, and any intermediate routers, switches, and other interfaces.
[0015] System 36, discussed in more detail below, represents a combination of hardware and programming configured to identify negative usage patterns for an application and corresponding user action triggers. System 36 associates the user action triggers with corresponding interface overlay actions. Upon detecting an occurrence of a corresponding user action trigger on a given computing device 40-44, system 36 serves to instigate the trigger's associated overlay action on that computing device 40-44.
[0016] Figs. 5-6 depict examples of physical and logical components for implementing interface overlay system 36. In Fig. 5, various components are identified as engines 48-54. In describing engines 48-54, the focus is on each engine's designated function. However, the term engine, as used herein, refers to a combination of hardware and programming configured to perform a designated function. As is illustrated later with respect to Fig. 6, the hardware of each engine includes one or both of a processor and a memory device, while the programming is code stored on that memory device and executable by the processor to perform the designated function.
[0017] Fig. 5 is a block diagram depicting components of interface overlay system 36. In this example, system 36 includes monitor engine 48, analysis engine 50, association engine 52, and instigation engine 54. In performing their respective functions, engines 48-54 may access data repository 56. Repository 56 represents generally any memory accessible to system 36 to which data can be stored and from which data can be retrieved.
[0018] Monitor engine 48 is configured to access monitoring data 58 collected for an application and to process that monitoring data 58 to identify user actions occurring on a plurality of computing devices with respect to that application. The monitoring data 58 may be collected by monitor engine 48 or by a component external to system 36. Monitoring data 58 is reflective of user actions with the application's user interface. For each of a number of users and user computing devices, the monitoring data, for example, may identify the application, the user, the user's device, and a sequence of user interactions with the interface and its various elements. Each sequence may include timing data that can be used to determine how rapidly an interface control or set of controls is being selected.
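As a concrete illustration only, the kind of record described above might be modeled as follows. The patent does not prescribe any data format; the TypeScript shape and field names below are assumptions introduced for this sketch.

```typescript
// Hypothetical shape of a single monitoring record (monitoring data 58).
// Field names are illustrative only and are not defined by the patent.
interface MonitoringEvent {
  applicationId: string; // the monitored application
  userId: string;        // the user performing the action
  deviceId: string;      // the user's computing device
  controlId: string;     // interface element interacted with, e.g. "commit-button-28"
  controlType: string;   // category of control, e.g. "menu" or "button"
  timestampMs: number;   // when the interaction occurred, in epoch milliseconds
}

// Monitoring data for one device is then an ordered sequence of such events.
type MonitoringData = MonitoringEvent[];
```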
[0019] In performing its function, monitoring engine 48 may process first monitoring data to identify first user actions. Analysis engine 50 is configured to recognize a sequence of user actions from the identified first user actions that is repeated with respect to two or more of the plurality of computing devices. The recognized sequence of user actions is indicative of a negative usage pattern, negative in that it may reflect a performance issue for the application. Such a sequence can include a threshold number of consecutive interactions with a single user interface control. It can include interactions with an interface control of a specified type exceeding a threshold interaction rate. Repeated selection of a control more than three times over a two second interval is an example. An identified sequence may include a threshold number of consecutive interactions with user interface controls selected from a set of user interface control types. As an example, the identified sequence may include repetitive selection from a set of menu controls. An identified sequence may include interactions, exceeding a threshold interaction rate, with user interface controls selected from a set of user interface controls. Repeated selection of any menu control more than five times over a ten second interval is an example.
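A minimal sketch of one such rate check, assuming the hypothetical MonitoringEvent shape introduced above; the default values mirror the more-than-three-selections-in-two-seconds example in the text.

```typescript
// Returns true if the given control was selected more than maxCount times
// within any sliding window of windowMs milliseconds.
function exceedsSelectionRate(
  events: MonitoringEvent[],
  controlId: string,
  maxCount = 3,
  windowMs = 2000
): boolean {
  const times = events
    .filter(e => e.controlId === controlId)
    .map(e => e.timestampMs)
    .sort((a, b) => a - b);

  let start = 0;
  for (let end = 0; end < times.length; end++) {
    // Shrink the window from the left until it spans at most windowMs.
    while (times[end] - times[start] > windowMs) start++;
    if (end - start + 1 > maxCount) return true;
  }
  return false;
}
```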
[0020] Analysis engine 50 may also be responsible for examining a recognized sequence to identify a corresponding user action trigger. A user action trigger is a user action or series of user actions taken from a recognized sequence. Stated another way, an occurrence of a user action trigger is an indicator that a recognized sequence may be or has been repeated. For example, a user action trigger for a sequence that includes the repeated selection of a single interface control may be a single selection of that same control, a threshold number of selections of that control, or a rate of selection for that particular control. A user action trigger for a sequence that includes repeated selection of a type of interface control may include a threshold number of consecutive selections of controls of that type or a rate of selection for that control type. In identifying a user action trigger, analysis engine 50 may identify an existing representation of that trigger from existing trigger data 60. Where there is no existing representation, analysis engine 50 may update trigger data 60 accordingly.
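By way of illustration, a trigger derived from such a recognized sequence could be recorded in trigger data 60 as a small structure like the one below; the representation and the example values are assumptions, not a format defined by the patent.

```typescript
// Hypothetical representation of a user action trigger (trigger data 60).
interface UserActionTrigger {
  triggerId: string;
  controlId?: string;   // present when the trigger concerns a single control
  controlType?: string; // present when the trigger concerns a control type
  threshold: number;    // number of selections that fires the trigger
  windowMs?: number;    // optional rate window; omitted for a plain count
}

// Example: a trigger for repeated selection of the commit control of Figs. 1-2,
// set below the recognized sequence's threshold so the overlay appears early.
const busyIndicatorTrigger: UserActionTrigger = {
  triggerId: "repeat-commit-28",
  controlId: "commit-button-28",
  threshold: 2,
  windowMs: 2000,
};
```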
[0021] In performing its function, monitoring engine 48 may process second monitoring data to identify second user actions. The term "second" is used in this instance to reflect that the second monitoring data is collected subsequent to the first monitoring data and that the second user actions occurred subsequent to the first user actions. Analysis engine 50 is configured to analyze the identified second user actions to identify occurrences of a user action trigger. In the example of Fig. 5, analysis engine 50 may process the identified second user actions against trigger data 60 to identify occurrences of user action triggers represented by trigger data 60.
[0022] Association engine 52 is configured to correlate the recognized sequence of user actions with the user action trigger and to associate the user action trigger with an interface overlay action. An interface overlay action is an action that, when performed, results in the display of an overlay on a user interface of an affected computing device that provides information to the user. For example, that information may indicate that the application is busy, or it may convey help to a user that appears to be seeking an application feature. In the example of Fig. 5, overlay action data 62 includes data for use in instigating a number of interface overlay actions. Association engine 52 may be responsible for updating overlay action data 62 to establish a link between each such interface overlay action representation and one or more user action triggers represented by trigger data 60.
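For illustration, the link that association engine 52 establishes between triggers and overlay actions could be as simple as a keyed map; the shapes, identifiers, and messages below are hypothetical.

```typescript
// Hypothetical representation of an interface overlay action (overlay action data 62).
interface OverlayAction {
  actionId: string;
  message: string;       // text shown in the overlay
  kind: "busy" | "help"; // signals a running backend, or offers assistance
}

// Association engine 52: user action trigger id -> interface overlay action.
const overlayAssociations = new Map<string, OverlayAction>([
  ["repeat-commit-28", { actionId: "overlay-30", message: "Still working on your request…", kind: "busy" }],
  ["menu-hunting", { actionId: "overlay-32", message: "Need a hand finding a feature?", kind: "help" }],
]);

// Disassociating a trigger later (e.g. once the application itself is fixed)
// is then simply a deletion from the map.
function disassociate(triggerId: string): void {
  overlayAssociations.delete(triggerId);
}
```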
[0023] Instigation engine 54 is configured to instigate the interface overlay action for each of the plurality of computing devices for which analysis engine 50 has identified an occurrence of the user action trigger. To do so, instigation engine 54 may query overlay action data 62 to identify a particular interface overlay action associated with that user action trigger. In performing its function, instigation engine 54 may directly or indirectly interact with the operating system of each such computing device to cause the display of the corresponding overlay. Alternatively, instigation engine 54 may interact directly or indirectly with the application's web server to cause that server to supply the computing device with data that causes the device to display the corresponding overlay.
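As one possible client-side realization, assuming the monitored application is a web application whose served content has been augmented with overlay logic (only one of the instigation paths mentioned above), the overlay itself could be rendered by appending an element to the page. Everything in this sketch is an assumption.

```typescript
// Minimal browser-side sketch: display a dismissable overlay with the given message.
function showOverlay(message: string): void {
  if (document.getElementById("ui-overlay")) return; // avoid stacking overlays

  const overlay = document.createElement("div");
  overlay.id = "ui-overlay";
  overlay.textContent = message;
  overlay.setAttribute(
    "style",
    "position:fixed;bottom:1rem;right:1rem;padding:0.75rem 1rem;" +
      "background:#333;color:#fff;border-radius:4px;z-index:9999;cursor:pointer;"
  );
  overlay.addEventListener("click", () => overlay.remove()); // click to dismiss
  document.body.appendChild(overlay);
}

// Usage: showOverlay("Your request is being processed…");
```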
[0024] As noted above, analysis engine 50 is responsible for identifying sequences of user actions for an application that are repeated by users of different devices and indicative of a negative usage pattern. When the sequence is repeated some threshold number of times, a presumption can be made that the negative usage pattern is the result of a functional issue with the application that should be addressed by a developer. Analysis engine 50 may be responsible for causing an entry reflective of the functional issue to be added to a tracking tool. The tracking tool, for example, may be a tool for tracking application defects to be considered and addressed by an application developer. Once addressed, the entry is closed or otherwise updated. Upon detecting the update, association engine 52 may then disassociate the interface overlay action from the user action trigger for the sequence associated with that performance issue. In the example of Fig. 5, such may include updating either or both of overlay action data 62 and trigger data 60.
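A brief sketch of how the tracking-tool step might look, assuming a generic defect record rather than any particular tracking product; the entry shape and helpers are hypothetical.

```typescript
// Hypothetical defect entry created when a negative usage pattern is confirmed.
interface DefectEntry {
  entryId: string;
  triggerId: string;   // the user action trigger tied to the pattern
  description: string; // human-readable summary for the developer
  status: "open" | "addressed";
}

// Build an entry for the recognized pattern; submitting it to a real tracking
// tool would happen elsewhere and is outside this sketch.
function buildDefectEntry(triggerId: string, description: string): DefectEntry {
  return {
    entryId: `defect-${triggerId}-${Date.now()}`,
    triggerId,
    description,
    status: "open",
  };
}

// Once an entry has been addressed, the corresponding association is removed,
// e.g. via the disassociate() helper sketched earlier.
function onEntryAddressed(entry: DefectEntry): void {
  if (entry.status === "addressed") disassociate(entry.triggerId);
}
```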
[0025] In the foregoing discussion, engines 48-54 were described as combinations of hardware and programming. Engines 48-54 may be implemented in a number of fashions. Looking at Fig. 6, the programming may be processor executable instructions stored on tangible memory resource 64, and the hardware may include processing resource 66 for executing those instructions. Thus memory resource 64 can be said to store program instructions that, when executed by processing resource 66, implement system 36 of Fig. 5.
[0026] Memory resource 64 represents generally any number of memory components capable of storing instructions that can be executed by processing resource 66. Memory resource 64 is non-transitory in the sense that it does not encompass a transitory signal but instead is made up of one or more memory components configured to store the relevant instructions. Memory resource 64 may be implemented in a single device or distributed across devices. Likewise, processing resource 66 represents any number of processors capable of executing instructions stored by memory resource 64. Processing resource 66 may be integrated in a single device or distributed across devices. Further, memory resource 64 may be fully or partially integrated in the same device as processing resource 66, or it may be separate but accessible to that device and processing resource 66.
[0027] In one example, the program instructions can be part of an installation package that when installed can be executed by processing resource 66 to implement system 36. In this case, memory resource 64 may be a portable medium such as a CD, DVD, or flash drive or a memory maintained by a server from which the installation package can be downloaded and installed. In another example, the program instructions may be part of an application or applications already installed. Here, memory resource 64 can include integrated memory such as a hard drive, solid state drive, or the like.
[0028] In Fig. 6, the executable program instructions stored in memory resource 64 are depicted as monitor, analysis, association, and instigation modules 68-74 respectively. Monitoring module 68 and analysis module 70 represent program instructions that, when executed, cause processing resource 66 to implement monitor engine 48 and analysis engine 50 respectively. Likewise, association module 72 and instigation module 74 represent program instructions that, when executed, cause the implementation of association engine 52 and instigation engine 54 respectively.
[0029] OPERATION: Figs. 7 and 8 are flow diagrams of actions taken to implement an example of an interface overlay method. Fig. 7 covers a first phase in which user action sequences indicative of performance issues are identified and associated with corresponding user action triggers and interface overlay actions. Fig. 8 addresses a second phase in which overlay actions are instigated following detection of associated user action triggers. In discussing Figs. 7 and 8, reference may be made to components depicted in Figs. 1-6. Such reference is made to provide contextual examples and not to limit the manner in which the method depicted by Figs. 7 and 8 may be implemented.
[0030] Starting with Fig. 7, first monitoring data is processed to identify user actions occurring on a plurality of computing devices with respect to a particular application (block 76). Such monitoring data includes information identifying a user's interactions with the application's user interface occurring on each of the plurality of computing devices. The identified user actions can include interactions with individual interface controls, interactions with categories or types of controls, and interactions with the interface in general such as mouse movements, keystrokes, touch screen gestures, and the like. Referring to Fig. 5, monitor engine 48 may be responsible for block 76. The monitoring data may be collected by monitor engine 48 or by a component independent of system 36.
[0031] From the processing in block 76, a sequence of the user actions is identified (block 78). The identified sequence is one that is repeated with respect to two or more of the plurality of devices and indicates a performance issue. Examples of such sequences include a threshold number of consecutive selections of a particular interface control or a control type and repetitive selections of a particular control or control type at or above a threshold rate. Referring to Fig. 5, analysis engine 50 may be responsible for implementing block 78.
[0032] A user action trigger is identified and associated with an interface overlay action for the indicated performance issue (block 80). Here, the user action trigger is a user action or series of user actions taken from the sequence recognized in block 78. Stated another way, an occurrence of the identified user action trigger is an indicator that the recognized sequence may be or has been repeated. Referring to Fig. 5, analysis engine 50 may be responsible for identifying the user action trigger, while association engine 52 may be responsible for associating that trigger with the interface overlay action.
[0033] Looking at blocks 78 and 80 together, the sequence of user events identified in block 78 may include a threshold number of selections of a user interface control or control type occurring within a specified timeframe. The trigger identified in block 80 may then be a set number of selections of the user interface control. The sequence of user events identified in block 78 may instead include a number of user interface control selections from a set of user interface controls occurring within a specified timeframe. Here, the set may encompass a type of interface control. The trigger identified in block 80 may then include a specified number of consecutive user interface control selections from the set of user interface controls as the user action trigger.
[0034] As noted above, the blocks of Fig. 7 depict a first phase in which monitoring data collected from a plurality of computing devices is analyzed to identify a repeated sequence of user actions that is indicative of a negative usage pattern, in other words, a repeated sequence that is indicative of an application performance issue. By identifying a trigger for that sequence and associating the trigger with an interface overlay action intended to address the performance issue, the process provides a framework for improving the performance of the application. In particular, the second phase depicted in Fig. 8 utilizes that framework to address a performance issue without requiring action from an application developer. This provides improved application performance for at least a period of time while users await an application update.
[0035] Moving to Fig. 8, second monitoring data is processed to identify occurrences of a user action trigger on the plurality of computing devices (block 82). The term "second" is used here to indicate that the monitoring data being processed has been collected subsequent to the collection of the "first" monitoring data processed in block 76 of Fig. 7. The trigger, for example, may be the trigger identified in block 80 of Fig. 7. Referring to Fig. 5, monitor engine 48 may be responsible for processing the second monitoring data to identify user actions occurring on the plurality of computing devices with respect to a given application. Analysis engine 50 may then be responsible for analyzing those user actions to identify occurrences of the user action trigger.
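A compressed sketch of that second-phase check, reusing the hypothetical MonitoringEvent and UserActionTrigger shapes from the sketches above: for one device's recent events, report every stored trigger that fires.

```typescript
// Return the ids of any stored user action triggers that fire for one device's events.
function detectTriggers(
  events: MonitoringEvent[],
  triggers: UserActionTrigger[]
): string[] {
  const fired: string[] = [];
  for (const t of triggers) {
    // Keep only the events the trigger cares about (a single control or a control type).
    const times = events
      .filter(
        e =>
          (t.controlId === undefined || e.controlId === t.controlId) &&
          (t.controlType === undefined || e.controlType === t.controlType)
      )
      .map(e => e.timestampMs)
      .sort((a, b) => a - b);

    if (t.windowMs === undefined) {
      // Plain count trigger.
      if (times.length >= t.threshold) fired.push(t.triggerId);
      continue;
    }

    // Rate trigger: threshold selections within a sliding window of windowMs.
    let start = 0;
    for (let end = 0; end < times.length; end++) {
      while (times[end] - times[start] > t.windowMs) start++;
      if (end - start + 1 >= t.threshold) {
        fired.push(t.triggerId);
        break;
      }
    }
  }
  return fired;
}
```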
[0036] Upon detecting an occurrence of the user action trigger with respect to a given one of the computing devices, the interface overlay action associated with that trigger is instigated for that computing device (block 84). Instigating, here, can include interacting with a server responsible for providing user interface content to that computing device to cause that server to augment that content with content for causing the display of the corresponding interface overlay. Instigating may also include directly or indirectly interacting with the operating system of the given computing device to cause the display of that interface overlay. Referring to Fig. 5, instigation engine 54 may be responsible for implementing block 84.
[0037] As noted, the sequence identified in block 78 is indicative of a performance issue for an application. Such an issue is typically addressed by the application's developer when releasing subsequent application updates. Beneficially, the process laid out in Figs. 7 and 8 may serve to identify functional issues early and in an automated fashion. Thus, upon identifying a sequence that is indicative of a negative usage pattern (performance issue), the process of Fig. 7 can be extended to include automatically adding an entry to a tracking tool, the entry indicative of the indicated performance issue. Here the tracking tool may be any software based tool that tracks application defects through a defined lifecycle. Ultimately an entry is closed or otherwise addressed. Such may be the result of an application update or a determination that there is not a problem to address. In either case, once the entry is addressed, the association of the interface overlay action and user action trigger that occurred in block 80 may be reversed. Such may be accomplished by association engine 52 disassociating the trigger and interface overlay action. As a result, an interface overlay action that had been instigated in block 84 based on the detection of the action trigger will no longer be instigated.
[0038] CONCLUSION: Figs. 1-3 depict example screen views of a user interface intended to help visualize examples in which interface overlay actions may be instigated. The particular layout of the user interface, its controls, and the particular overlays depicted are examples only. Figs. 4-6 aid in depicting the architecture, functionality, and operation of various embodiments. In particular, Figs. 4-6 depict various physical and logical components. Various components are defined at least in part as programs or programming. Each such component, portion thereof, or various combinations thereof may represent in whole or in part a module, segment, or portion of code that comprises one or more executable instructions to implement any specified logical function(s). Each component or various combinations thereof may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).
[0039] Embodiments can be realized in any memory resource for use by or in connection with a processing resource. A "processing resource" is an instruction execution system such as a computer/processor based system or an ASIC (Application Specific Integrated Circuit) or other system that can fetch or obtain instructions and data from computer-readable media and execute the instructions contained therein. A "memory resource" is any non-transitory storage media that can contain, store, or maintain programs and data for use by or in connection with the instruction execution system. The term "non-transitory" is used only to clarify that the term media, as used herein, does not encompass a signal. Thus, the memory resource can comprise any one of many physical media such as, for example, electronic, magnetic, optical, electromagnetic, or semiconductor media. More specific examples of suitable computer-readable media include, but are not limited to, hard drives, solid state drives, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory, flash drives, and portable compact discs.
[0040] Although the flow diagrams of Figs. 7-8 show specific orders of execution, the orders of execution may differ from that which is depicted. For example, the order of execution of two or more blocks or arrows may be scrambled relative to the order shown. Also, two or more blocks shown in succession may be executed concurrently or with partial concurrence. All such variations are within the scope of the present invention.
[0041] The present invention has been shown and described with reference to the foregoing exemplary embodiments. It is to be understood, however, that other forms, details and embodiments may be made without departing from the spirit and scope of the invention that is defined in the following claims.

Claims

CLAIMS
What is claimed is:
1. An interface overlay method, comprising:
processing first monitoring data to identify user actions occurring on a plurality of computing devices with respect to a particular application;
from the processing, recognizing a sequence of the user actions that is repeated with respect to two or more of the plurality of devices, the sequence indicating a performance issue; and
identifying a user action trigger for the identified sequence and associating the user action trigger with an interface overlay action for the indicated performance issue.
2. The method of Claim 1 , comprising:
processing, with respect to the particular application, second monitoring data to identify occurrences of the user action trigger on the plurality of computing devices; and
upon detecting an occurrence with respect to a given one of the computing devices, instigating the associated interface overlay action for that computing device.
3. The method of Claim 1, comprising:
following recognition of the sequence of user actions, automatically adding an entry to a tracking tool, the entry indicative of the indicated performance issue; and
upon the entry being addressed, automatically disassociating the user action trigger from the interface overlay action for the indicated performance issue.
4. The method of Claim 1, wherein:
the sequence of user actions includes a threshold number of selections of a user interface control occurring within a specified timeframe;
identifying comprises identifying a set number of selections of the user interface control as the user action trigger.
5. The method of Claim 1, wherein:
the sequence of user actions includes a number of user interface control selections from a set of user interface controls occurring within a specified timeframe;
identifying comprises identifying a specified number of consecutive user interface control selections from the set of user interface controls as the user action trigger.
6. A memory resource storing instructions that when executed cause a processing resource to implement an interface overlay system, the instructions comprising:
a monitor module that, when executed, causes the processing resource to process monitoring data to identify, for each of a plurality of computing devices, user actions with a particular application;
an analysis module that, when executed, causes the processing resource to analyze the identified user actions for each of the plurality of computing devices to identify occurrences of a user action trigger associated with a performance issue; and
an instigation module that, when executed, causes the processing resource to instigate an interface overlay action associated with the user action trigger for a particular one of the plurality of computing devices as a result of an identified occurrence of the user action trigger with respect to that computing device.
7. The memory resource of Claim 6, wherein:
the monitoring data is second monitoring data, and the identified user actions are identified second user actions;
the monitor module, when executed, causes the processing resource to process first monitoring data to identify first user actions, the first monitoring data collected prior to the second monitoring data;
the analysis module, when executed, causes the processing resource to recognize a sequence of user actions from the identified first user actions that is indicative of a performance issue and that is repeated with respect to two or more of the plurality of computing devices; and
the instructions include an association module that, when executed, causes the processing resource to associate the sequence of user actions with the user action trigger and the interface overlay action.
8. The memory resource of Claim 7, wherein the association module, when executed, causes the processing resource to:
following recognition of the sequence of user actions, initiate the addition of an entry to a tracking tool, the entry indicative of the performance issue; and
following an indication that the entry has been addressed, disassociate the user action trigger from the interface overlay action.
9. The memory resource of Claim 7, wherein the user action trigger comprises a number of consecutive selections of a user interface control.
10. The memory resource of Claim 6, wherein the user action trigger comprises a specified number of consecutive user interface control selections from a set of user interface control types.
11. An interface overlay system, comprising:
a monitor engine configured to process first monitoring data to identify first user actions and to process second, subsequent monitoring data to identify second user actions, the first and second user actions occurring on a plurality of computing devices with respect to an application;
an analysis engine configured to recognize a sequence of user actions from the identified first user actions that is repeated with respect to two or more of the plurality of computing devices and to analyze the identified second user actions to identify occurrences of a user action trigger;
an association engine configured to correlate the recognized sequence of user actions with the user action trigger and to associate the user action trigger with an interface overlay action; and
an instigation engine configured to instigate the interface overlay action for each of the plurality of computing devices for which the analysis engine identified an occurrence of the user action trigger.
12. The system of Claim 11, wherein the analysis engine is configured to identify the user action trigger from the recognized sequence of user actions.
13. The system of Claim 11, wherein the association engine is configured to:
following recognition of the sequence of user actions by the analysis engine, initiate the addition of an entry to a tracking tool, the entry indicative of a performance issue; and
following an indication that the entry has been addressed, disassociate the user action trigger from the interface overlay action.
14. The system of Claim 11, wherein the recognized sequence of user actions is indicative of a performance issue of the application.
15. The system of Claim 14, wherein the recognized sequence of user actions includes at least one of:
a threshold number of consecutive interactions with a single user interface control;
interactions with an interface control of a specified type exceeding a threshold interaction rate;
a threshold number of consecutive interactions with user interface controls selected from a set of user interface control types; and
interactions, exceeding a second threshold interaction rate, with user interface controls selected from a set of user interface control types.
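For orientation only, the following is a minimal, non-limiting sketch of the flow recited in Claims 1 and 2: recognizing a sequence of user actions repeated across devices, treating it as a user action trigger, and instigating the associated overlay action when the trigger recurs. Every name below (recognize_repeated_sequence, monitor_and_overlay, overlay_service and its methods) is an assumption introduced for illustration; the sketch is not the claimed implementation.

```python
# Illustrative, non-limiting sketch of the flow of Claims 1-2.
# All names are assumptions introduced for illustration only.
from collections import Counter

def recognize_repeated_sequence(actions_by_device, seq_len=3, min_devices=2):
    """Return a sequence of user actions seen on at least min_devices devices, or None."""
    device_counts = Counter()
    for actions in actions_by_device.values():
        windows = {tuple(actions[i:i + seq_len])
                   for i in range(len(actions) - seq_len + 1)}
        for window in windows:
            device_counts[window] += 1        # each device counted at most once
    repeated = [seq for seq, n in device_counts.items() if n >= min_devices]
    return repeated[0] if repeated else None

def monitor_and_overlay(first_data, second_data, overlay_service):
    # Process first monitoring data and recognize a repeated sequence (Claim 1).
    sequence = recognize_repeated_sequence(first_data)
    if sequence is None:
        return
    trigger = sequence                        # here the trigger is the sequence itself
    overlay_action = overlay_service.action_for(sequence)

    # Process second monitoring data and instigate the overlay per device (Claim 2).
    for device, actions in second_data.items():
        hits = (tuple(actions[i:i + len(trigger)]) == trigger
                for i in range(len(actions) - len(trigger) + 1))
        if any(hits):
            overlay_service.instigate(device, overlay_action)
```

In practice, the trigger might be generalized from the recognized sequence rather than reused verbatim, for example as a threshold number of consecutive selections within a specified timeframe as in Claims 4 and 5; the sketch reuses the sequence only for brevity.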

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2014/066175 WO2016080969A1 (en) 2014-11-18 2014-11-18 User interface overlay

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2014/066175 WO2016080969A1 (en) 2014-11-18 2014-11-18 User interface overlay

Publications (1)

Publication Number Publication Date
WO2016080969A1 (en) 2016-05-26

Family

ID=56014326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/066175 WO2016080969A1 (en) 2014-11-18 2014-11-18 User interface overlay

Country Status (1)

Country Link
WO (1) WO2016080969A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1139108A2 (en) * 2000-03-02 2001-10-04 Texas Instruments Incorporated Scan interface with TDM feature for permitting signal overlay
US20060125961A1 (en) * 2002-05-13 2006-06-15 Microsoft Corporation Selectively overlaying a user interface atop a video signal
US20080163090A1 (en) * 2006-12-28 2008-07-03 Yahoo! Inc. Interface overlay
US20140157159A1 (en) * 2012-11-30 2014-06-05 International Business Machines Corporation Collaborative overlay of user interface elements rendered on the display of a computing device
EP2743825A1 (en) * 2012-12-13 2014-06-18 Sap Ag Dynamical and smart positioning of help overlay graphics in a formation of user interface elements

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14906389; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14906389; Country of ref document: EP; Kind code of ref document: A1)