CN104956301A - Display device and method of controlling the same


Info

Publication number
CN104956301A
Authority
CN
China
Prior art keywords
window
application
display device
order
display
Prior art date
Legal status
Granted
Application number
CN201380071613.8A
Other languages
Chinese (zh)
Other versions
CN104956301B (en)
Inventor
金永振
金刚兑
朴大旭
金泰秀
崔祯桓
金圣熙
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020130022422A external-priority patent/KR102172792B1/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to CN201910525925.0A priority Critical patent/CN110413191B/en
Priority to CN201711096847.4A priority patent/CN107967087B/en
Priority to CN201910525895.3A priority patent/CN110427130B/en
Priority claimed from PCT/KR2013/011309 external-priority patent/WO2014088375A1/en
Publication of CN104956301A publication Critical patent/CN104956301A/en
Application granted granted Critical
Publication of CN104956301B publication Critical patent/CN104956301B/en
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

A display device with a touch screen that executes at least one application, and a method for controlling the display device, are provided. The method includes receiving an application execution command to execute at least one application, determining at least one of a size and a position of a window that executes the at least one application according to a position at which the application execution command is input, and displaying the window according to the at least one of the size and position of the window.

Description

Display device and method of controlling the same
Technical field
The present disclosure relates to a display device and a method of controlling the same. More particularly, the present disclosure relates to a display device that displays windows in which applications run, and a method of controlling the display device.
Background art
A desktop computer is equipped with at least one display device (for example, a monitor). Similarly, a mobile device with a touch screen (for example, a mobile phone, a smartphone, or a tablet personal computer (PC)) is also equipped with a display device.
A user of a desktop computer may divide the screen of the display device according to the working environment (for example, splitting the screen horizontally or vertically and opening multiple windows in the divided screen). When a web browser is running, the user can scroll a web page up or down using the page-up or page-down key of a keyboard. If the user uses a mouse instead of a keyboard, the web page can be scrolled up or down by selecting the scroll bar at the side of the page with the mouse cursor. The user can also move to the top of the web page by selecting a go-to-top button displayed as text or an icon at the bottom of the page.
Compared to a desktop computer, a mobile device has a small screen and limited input means. It is therefore difficult to divide the screen of a mobile device.
A mobile device can run multiple applications. The applications include basic applications installed by the manufacturer during manufacture and additional applications downloaded from an application store. Additional applications may be developed by ordinary users and registered with the application store; therefore, anyone can freely sell applications of his or her own development to mobile users through the application store. Currently, tens of thousands to hundreds of thousands of free or paid applications are available for a mobile device, depending on the product.
Summary of the invention
Technical problem
Although mobile devices offer many applications that stimulate user interest and satisfy consumer demand, mobile devices are limited in display size and user interface (UI) because of their portable size. As a result, users feel inconvenienced when running multiple applications on their mobile devices. For example, when an application is executed on a mobile device, it is displayed across the entire display area. If the user wants to run another application while the current application is running, the user must first end the ongoing application and then select an execution key to run the desired application. In other words, to run multiple applications the user must repeat a frustrating process of executing and terminating each application. However, a method for running multiple applications simultaneously on a mobile device has yet to be specified.
As mentioned above, although many applications that stimulate user interest and satisfy consumer demand are provided to mobile devices, the mobile devices are limited in display size and UI because of their portable size. As a result, users feel inconvenienced when running multiple applications on their mobile devices.
Accordingly, there is a need for a method of displaying multiple windows on a single display. There is also a need for a method of easily invoking multiple windows and conveniently arranging the windows after they are invoked.
More particularly, when multiple overlapping windows are displayed, a scheme is needed for switching from the currently displayed window to another, lower-priority window.
The above information is presented as background information only, to assist in understanding the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Technical solution
Aspects of the present disclosure are intended to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a display device that runs multiple windows of various sizes on a single display and facilitates switching from one window to another, lower-layer window, and a method of controlling the display device.
In accordance with an aspect of the present disclosure, a method is provided for controlling a display device that has a touch screen and executes at least one application. The method includes receiving an application execution command to execute at least one application, determining at least one of a size and a position of a window in which the at least one application will run according to a position at which the application execution command is input, and displaying the window according to the at least one of the size and the position of the window.
In accordance with another aspect of the present disclosure, a display device is provided. The display device includes a touch screen configured to receive an application execution command to execute at least one application, and a controller configured to determine at least one of a size and a position of a window in which the at least one application will run according to a position at which the application execution command is input, and to control display of the window on the touch screen according to the at least one of the size and the position of the window.
In accordance with another aspect of the present disclosure, a method is provided for executing applications in a display device including a touch screen. The method includes displaying an execution window of an application in each of a plurality of regions of the touch screen, displaying a button on at least one boundary separating the plurality of regions, receiving an input selecting the button, and displaying, in a specific region among the plurality of regions determined according to the received input, a list of at least one application executed in that region.
In accordance with another aspect of the present disclosure, a method is provided for executing applications in a display device including a touch screen. The method includes displaying an execution window of an application in each of a plurality of regions of the touch screen, displaying a button on at least one boundary separating the plurality of regions, displaying a list of at least one application execution icon in a partial region of the touch screen, receiving a drag input that drags an application execution icon from the list, determining a region in which a new application will be executed according to the end position of the drag input and the position of the button, and displaying an execution window of the application corresponding to the application execution icon in the determined region.
In accordance with another aspect of the present disclosure, a display device is provided. The display device includes a touch screen configured to display an execution window of an application in each of a plurality of regions, to display a button on at least one boundary separating the plurality of regions, and to receive an input selecting the button, and a controller configured to display, in a specific region among the plurality of regions determined according to the received input, a list of at least one application executed in that region.
In accordance with another aspect of the present disclosure, a display device is provided. The display device includes a touch screen and a controller. The touch screen is configured to display an execution window of an application in each of a plurality of regions, to display a button on at least one boundary separating the plurality of regions, to display a list of at least one application execution icon in a partial region of the touch screen, and to receive a drag input that drags an application execution icon from the list. The controller is configured to determine a region in which a new application will be executed based on the end position of the drag input and the position of the button, and to control the touch screen to display an execution window of the application corresponding to the application execution icon in the determined region.
Other aspects, advantages, and salient features of the present disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
Brief description of the drawings
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram of a display device according to an embodiment of the present disclosure;
Fig. 2a, Fig. 2b, Fig. 2c, Fig. 2d, Fig. 2e, Fig. 2f, Fig. 2g, Fig. 2h, Fig. 2i, Fig. 2j and Fig. 2k illustrate a window execution method according to an embodiment of the present disclosure;
Fig. 3a, Fig. 3b, Fig. 3c, Fig. 3d, Fig. 3e, Fig. 3f, Fig. 3g, Fig. 3h and Fig. 3i illustrate activity stacks managed by a display device according to an embodiment of the present disclosure;
Fig. 4a is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 4b is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 5 illustrates display orders (Z orders) of windows according to an embodiment of the present disclosure;
Fig. 6a, Fig. 6b, Fig. 6c and Fig. 6d illustrate an application execution method according to an embodiment of the present disclosure;
Fig. 7 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 8a, Fig. 8b, Fig. 8c, Fig. 8d, Fig. 8e, Fig. 8f, Fig. 8g, Fig. 8h, Fig. 8i, Fig. 8j, Fig. 8k, Fig. 8l and Fig. 8m illustrate a method of displaying multiple windows according to an embodiment of the present disclosure;
Fig. 9a, Fig. 9b, Fig. 9c, Fig. 9d, Fig. 9e, Fig. 9f, Fig. 9g and Fig. 9h illustrate layouts according to an embodiment of the present disclosure;
Fig. 10a, Fig. 10b, Fig. 10c and Fig. 10d illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 11a, Fig. 11b and Fig. 11c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 12a, Fig. 12b and Fig. 12c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 13a, Fig. 13b and Fig. 13c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 14a, Fig. 14b and Fig. 14c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 15a, Fig. 15b and Fig. 15c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 16a, Fig. 16b, Fig. 16c and Fig. 16d illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 17 illustrates a screen of a display device according to an embodiment of the present disclosure;
Fig. 18a and Fig. 18b illustrate a nine-region split mode according to an embodiment of the present disclosure;
Fig. 19 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 20 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 21a, Fig. 21b and Fig. 21c illustrate screens of a display device according to an embodiment of the present disclosure;
Fig. 22 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 23a and Fig. 23b illustrate screens of a display device, describing a Z-order change, according to an embodiment of the present disclosure;
Fig. 24 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 25a and Fig. 25b illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 26 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 27a and Fig. 27b illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 28 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 29a and Fig. 29b illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 30 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 31a and Fig. 31b illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 32 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 33a and Fig. 33b illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 34 illustrates an activity stack according to an embodiment of the present disclosure;
Fig. 35 is a flowchart illustrating a method of controlling a display device according to an embodiment of the present disclosure;
Fig. 36a, Fig. 36b and Fig. 36c illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 37a, Fig. 37b and Fig. 37c illustrate activity stacks according to an embodiment of the present disclosure;
Fig. 38a, Fig. 38b and Fig. 38c illustrate screens of a display device, describing a Z-order change command, according to an embodiment of the present disclosure;
Fig. 39a, Fig. 39b and Fig. 39c illustrate activity stacks according to an embodiment of the present disclosure;
Fig. 40a, Fig. 40b, Fig. 40c, Fig. 40d, Fig. 40e, Fig. 40f, Fig. 40g, Fig. 40h, Fig. 40i, Fig. 40j and Fig. 40k illustrate a method of displaying application execution windows according to an embodiment of the present disclosure;
Fig. 41a, Fig. 41b, Fig. 41c, Fig. 41d, Fig. 41e and Fig. 41f illustrate activity stacks according to various embodiments of the present disclosure;
Fig. 42 is a flowchart illustrating a method of executing applications in a display device according to an embodiment of the present disclosure;
Fig. 43a and Fig. 43b illustrate a method of controlling display regions of application execution windows using a center button according to an embodiment of the present disclosure;
Fig. 44a, Fig. 44b, Fig. 44c, Fig. 44d, Fig. 44e, Fig. 44f, Fig. 44g, Fig. 44h, Fig. 44i, Fig. 44j, Fig. 44k, Fig. 44l, Fig. 44m, Fig. 44n, Fig. 44o, Fig. 44p, Fig. 44q, Fig. 44r, Fig. 44s, Fig. 44t, Fig. 44u, Fig. 44v and Fig. 44w illustrate a method of executing multiple applications according to an embodiment of the present disclosure;
Fig. 45a, Fig. 45b, Fig. 45c, Fig. 45d, Fig. 45e, Fig. 45f, Fig. 45g, Fig. 45h, Fig. 45i and Fig. 45j illustrate activity stacks according to an embodiment of the present disclosure;
Fig. 46 illustrates a method of providing a user interface for executing applications in a display device according to an embodiment of the present disclosure;
Fig. 47 is a flowchart illustrating a method of executing applications in a display device according to an embodiment of the present disclosure;
Fig. 48 is a block diagram of a display device according to an embodiment of the present disclosure; and
Fig. 49a, Fig. 49b, Fig. 49c and Fig. 49d illustrate a method of displaying a button according to an embodiment of the present disclosure.
Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
Detailed description
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings; the inventors use them merely to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purposes only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a component surface" includes reference to one or more of such surfaces.
By the term "substantially" it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example tolerances, measurement error, measurement accuracy limitations, and other factors known to those skilled in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
While ordinal numbers such as "first" and "second" may be used to describe various components, the components are not limited by these terms. The terms are used only to distinguish one component from another. For example, within the scope and spirit of the present disclosure, a first component could be termed a second component, and vice versa. The term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is provided to describe various embodiments and is not intended to limit the present disclosure. As used herein, singular expressions include plural expressions unless the context clearly indicates otherwise. In the present description, the terms "include" and "have" should not be interpreted as necessarily covering all of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification; rather, it should be understood that one or more features, numbers, steps, operations, components, parts, or combinations thereof may be omitted or added.
Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by those skilled in the art. In addition, terms defined in ordinary dictionaries should be interpreted as having meanings consistent with their contextual meanings in the related art, and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Fig. 1 is a block diagram of a display device according to an embodiment of the present disclosure.
Referring to Fig. 1, the display device 100 may be connected to an external device (not shown) through a mobile communication module 120, a sub-communication module 130, or a connector 165. The term "external device" covers a variety of devices, such as another device (not shown), a mobile phone (not shown), a smartphone (not shown), a tablet personal computer (PC) (not shown), a server (not shown), and the like.
The display device 100 includes a touch screen 190 and a touch screen controller 195. The display device 100 further includes a controller 110, the mobile communication module 120, the sub-communication module 130, a multimedia module 140, a camera module 150, a global positioning system (GPS) module 155, an input/output (I/O) module 160, a sensor module 170, a memory (storage) 175, and a power supply 180. The sub-communication module 130 includes at least one of a wireless local area network (WLAN) module 131 and a short-range communication module 132. The multimedia module 140 includes at least one of a broadcast communication module 141, an audio play module 142, and a video play module 143. The camera module 150 includes at least one of a first camera 151 and a second camera 152, and the I/O module 160 includes at least one of buttons 161, a microphone 162, a speaker 163, a vibration motor 164, the connector 165, and a keypad 166.
The controller 110 may include a central processing unit (CPU) 111, a read-only memory (ROM) 112 that stores a control program for controlling the display device 100, and a random access memory (RAM) 113 used as a memory space for operations performed by the display device 100. The CPU 111 may include one or more cores. The CPU 111, the ROM 112, and the RAM 113 may be connected to one another through an internal bus.
The controller 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, the memory 175, the power supply 180, the touch screen 190, and the touch screen controller 195.
The mobile communication module 120 connects the display device 100 to an external device through mobile communication, using one or more antennas (not shown), under the control of the controller 110. The mobile communication module 120 transmits and receives wireless signals to and from a mobile phone (not shown), a smartphone (not shown), a tablet PC (not shown), or another device (not shown) that has a phone number entered into the display device 100, for a voice call, a video call, a short message service (SMS) message, or a multimedia messaging service (MMS) message.
The sub-communication module 130 may include at least one of the WLAN module 131 and the short-range communication module 132.
The WLAN module 131 may be connected to the Internet, under the control of the controller 110, in a place where a wireless access point (AP) (not shown) is installed. The WLAN module 131 supports the WLAN standard of the Institute of Electrical and Electronics Engineers (IEEE), IEEE 802.11x. The short-range communication module 132 may perform short-range wireless communication between the display device 100 and an imaging device (not shown) under the control of the controller 110. The short-range communication may conform to Bluetooth, Infrared Data Association (IrDA), ZigBee, and the like.
The display device 100 may include at least one of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132 according to its capabilities. For example, the display device 100 may include a combination of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132 according to its capabilities.
The multimedia module 140 may include the broadcast communication module 141, the audio play module 142, or the video play module 143. The broadcast communication module 141 may receive, under the control of the controller 110, a broadcast signal (for example, a TV broadcast signal, a radio broadcast signal, or a data broadcast signal) and additional broadcast information (for example, an electronic program guide (EPG) or an electronic service guide (ESG)) from a broadcasting station through a broadcast communication antenna (not shown). The audio play module 142 may play a stored or received digital audio file (for example, a file having an extension such as mp3, wma, ogg, or wav) under the control of the controller 110. The video play module 143 may play a stored or received digital video file (for example, a file having an extension such as mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110. The video play module 143 may also play a digital audio file.
The multimedia module 140 may include the audio play module 142 and the video play module 143 without the broadcast communication module 141. Alternatively, the audio play module 142 or the video play module 143 of the multimedia module 140 may be incorporated into the controller 110.
The camera module 150 may include at least one of the first camera 151 and the second camera 152, which capture a still image or a video under the control of the controller 110. The first camera 151 or the second camera 152 may include an auxiliary light source (for example, a flash (not shown)) that provides the light intensity needed to capture an image. The first camera 151 may be disposed on the front surface of the display device 100, and the second camera 152 may be disposed on the rear surface of the display device 100. Alternatively, the first camera 151 and the second camera 152 may be arranged near to each other (for example, the distance between the first camera 151 and the second camera 152 may be between 1 cm and 8 cm) in order to capture a three-dimensional still image or video.
The GPS module 155 may receive signal waves from a plurality of GPS satellites (not shown) in Earth orbit and calculate the position of the display device 100 based on the times of arrival (ToAs) of the satellite signals from the GPS satellites to the display device 100.
The I/O module 160 may include at least one of the plurality of buttons 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166.
The buttons 161 may be formed on the front surface, a side surface, or the rear surface of the housing of the display device 100, and may include a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button.
The microphone 162 receives a voice or a sound under the control of the controller 110 and converts the received voice or sound into an electrical signal.
The speaker 163 may output, to the outside of the display device 100, sounds corresponding to various signals (for example, a wireless signal, a broadcast signal, a digital audio file, a digital video file, a photo shot, and the like) received from the mobile communication module 120, the sub-communication module 130, the multimedia module 140, and the camera module 150. The speaker 163 may output sounds corresponding to functions performed by the display device 100 (for example, a button manipulation sound or a ringback tone for a call). One or more speakers 163 may be disposed at an appropriate position or positions of the housing of the display device 100.
The vibration motor 164 may convert an electrical signal into a mechanical vibration under the control of the controller 110. For example, the vibration motor 164 operates when the display device 100 receives an incoming voice call from another mobile device (not shown) in vibration mode. One or more vibration motors 164 may be mounted inside the housing of the display device 100. The vibration motor 164 may operate in response to a user's touch on the touch screen 190 and a continuous movement of the touch on the touch screen 190.
The connector 165 may be used as an interface that connects the display device 100 to an external device (not shown) or a power source (not shown). The connector 165 may transmit data stored in the memory 175 to an external device connected through a cable to the connector 165, or may receive data from the external device through the cable, under the control of the controller 110. The display device 100 may receive power from a power source through a cable connected to the connector 165, or may charge a battery (not shown) using the power source.
The keypad 166 may receive a key input from the user to control the display device 100. The keypad 166 includes a physical keypad (not shown) formed on the display device 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad may not be provided, depending on the capabilities or configuration of the display device 100.
The sensor module 170 includes at least one sensor (not shown) that detects a state of the display device 100. For example, the sensor module 170 may include a proximity sensor that detects whether the user is near the display device 100, an illuminance sensor that detects the amount of ambient light around the display device 100, or a motion sensor that detects a motion of the display device 100 (for example, rotation, acceleration, or vibration). The at least one sensor may detect a state of the display device 100, generate a signal corresponding to the detected state, and transmit the generated signal to the controller 110. A sensor may be added to or removed from the sensor module 170 according to the capabilities of the display device 100.
The memory 175 may store input/output signals or data in accordance with operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, and the touch screen 190 under the control of the controller 110. The memory 175 may store programs and applications for controlling the display device 100 or the controller 110.
The term "memory" covers the memory 175, the ROM 112 and the RAM 113 within the controller 110, and a memory card (not shown) (for example, a secure digital (SD) card or a memory stick) mounted in the display device 100. The memory may include a non-volatile memory, a volatile memory, a hard disk drive (HDD), or a solid-state drive (SSD).
The power supply 180 may supply power to one or more batteries (not shown) mounted in the housing of the display device 100 under the control of the controller 110. The one or more batteries supply power to the display device 100. In addition, the power supply 180 may supply power received from an external power source (not shown) through a cable connected to the connector 165 to the display device 100.
The touch screen 190 may provide user interfaces (UIs) corresponding to various services (for example, call, data transmission, broadcasting, photography, and the like) to the user. The touch screen 190 may transmit an analog signal corresponding to at least one touch on a UI to the touch screen controller 195. The touch screen 190 may receive at least one touch input through a user's body part (for example, a finger) or a touch input device (for example, a stylus pen). The touch screen 190 may also receive a touch input signal corresponding to a continuous movement of one of one or more touches. The touch screen 190 may transmit an analog signal corresponding to the continuous movement of the input touch to the touch screen controller 195.
As used herein, "touch" may include a non-contact touch (that is, a detectable gap between the touch screen 190 and the user's body part or the touch input device of 1 mm or less), and is not limited to contact between the touch screen 190 and the user's body part or the touch input device. The gap detectable by the touch screen 190 may vary according to the capabilities or configuration of the display device 100.
The touch screen 190 may be implemented, for example, as a resistive type, a capacitive type, an infrared type, or an acoustic wave type.
The touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (for example, X and Y coordinates). The controller 110 may control the touch screen 190 using the digital signal received from the touch screen controller 195. For example, the controller 110 may control selection or execution of a shortcut icon (not shown) displayed on the touch screen 190 in response to a touch. The touch screen controller 195 may be incorporated into the controller 110.
Fig. 2 a, Fig. 2 b, Fig. 2 c, Fig. 2 d, Fig. 2 e, Fig. 2 f, Fig. 2 g, Fig. 2 h, Fig. 2 i, Fig. 2 j and Fig. 2 k illustrate the window operation method according to disclosure embodiment.Those skilled in the art will easily understand, and display device 200 can be any one in the display device 100 shown in Fig. 1, standard TV (TV), internet TV, medical data display device etc.Therefore, as long as it is equipped with the device for showing the image presented, any equipment can be used as display device.
With reference to Fig. 2 a, display device 200 can define multiple window viewing area 201,202,203 and 204 on the touchscreen.Such as, controller (not shown) can configure first window viewing area 201, Second Window viewing area 202, the 3rd window viewing area 203 and the 4th window viewing area 204.Controller can arrange the first boundary line 211 between first window viewing area 201 and Second Window viewing area 202, the Second Edge boundary line 212 between the 3rd window viewing area 203 and the 4th window viewing area 204, the 3rd boundary line 213 between first window viewing area 201 and the 3rd window viewing area 203 and the 4th boundary line 214 between Second Window viewing area 202 and the 4th window viewing area 204.First boundary line 211 and Second Edge boundary line 212 can be connected to become single line, and the 3rd boundary line 213 and the 4th boundary line 214 can be connected to become single line.Controller configuration the first to the four window viewing area 201,202,203 and 204 does not overlap each other to make them.With reference to 2a, such as, first window viewing area 201 is defined in upper left corner by controller, and Second Window viewing area 202 is defined in upper right corner, 3rd window viewing area 203 is defined in lower-left corner, the 4th window viewing area 204 is defined in bottom right corner.Screen divider is become left side and right-hand part by the first boundary line 211 and Second Edge boundary line 212 by controller, and by the 3rd boundary line 213 and the 4th boundary line 214, screen divider is become the first half and Lower Half.
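For illustration only, the four-region split of Fig. 2a can be pictured as a simple partition of the screen bounds. The class and method names below are assumptions made for this sketch and are not part of the disclosed device.

```java
import java.awt.Rectangle;

/** Illustrative helper that splits a screen into the four window display regions of Fig. 2a. */
public final class QuadLayout {
    /** Returns {upper-left, upper-right, lower-left, lower-right} regions, corresponding to 201..204. */
    public static Rectangle[] splitIntoFour(Rectangle screen) {
        int halfW = screen.width / 2;
        int halfH = screen.height / 2;
        return new Rectangle[] {
            new Rectangle(screen.x,         screen.y,         halfW,                halfH),                 // region 201
            new Rectangle(screen.x + halfW, screen.y,         screen.width - halfW, halfH),                 // region 202
            new Rectangle(screen.x,         screen.y + halfH, halfW,                screen.height - halfH), // region 203
            new Rectangle(screen.x + halfW, screen.y + halfH, screen.width - halfW, screen.height - halfH)  // region 204
        };
    }
}
```

The two vertical boundary lines 211 and 212 fall on the shared vertical edge of these rectangles, and the horizontal lines 213 and 214 fall on the shared horizontal edge.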
The controller displays a center button 220 at the intersection of the first and second boundary lines 211 and 212 with the third and fourth boundary lines 213 and 214. The center button 220 may be a function key for changing the size of an application display region or for switching the display device 200 into a window repositioning mode.
The controller controls display of a window in each of the window display regions 201, 202, 203, and 204 so that an application runs in the window. For example, the controller controls display of a window in each of the window display regions 201, 202, 203, and 204 as illustrated in Fig. 2b, Fig. 2c, Fig. 2d, Fig. 2e, Fig. 2f, Fig. 2g, Fig. 2h, Fig. 2i, Fig. 2j and Fig. 2k.
A window may include an execution screen of a specific application and a title bar of the running application. Objects related to the application may be displayed on the execution screen of the application. The objects may take various forms, such as text, a figure, an icon, a button, a check box, a photo, a video, a web page, a map, and the like. When the user touches an object, a function or event corresponding to the touched object may be executed in the application. An object may be called a view, depending on the operating system (OS). The title bar may include at least one control key for controlling display of the window. For example, the at least one control key may include a window minimize button, a window maximize button, and a window close button.
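Purely as an illustration, a window as described above might be modeled roughly as follows. The field names, the enum, and the use of plain strings for views are assumptions of this sketch, not the patent's own data structures.

```java
import java.util.List;

/** Illustrative model of a window: an application execution screen plus a title bar with control keys. */
class AppWindow {
    enum ControlKey { MINIMIZE, MAXIMIZE, CLOSE }

    final String applicationTitle;       // shown in the title bar
    final List<ControlKey> controlKeys;  // e.g. minimize / maximize / close
    final List<String> viewObjects;      // objects (views) on the execution screen: text, icons, buttons, ...
    int zOrder;                          // display order relative to other windows

    AppWindow(String applicationTitle, List<ControlKey> controlKeys, List<String> viewObjects) {
        this.applicationTitle = applicationTitle;
        this.controlKeys = controlKeys;
        this.viewObjects = viewObjects;
    }
}
```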
An application is a program written independently by the manufacturer of the display device 200 or by an application developer. Therefore, executing one application does not require another application to be running in advance, and even when one application ends, another application may continue to run.
In contrast to a composite-function application (or a dual application), which is designed by adding some functions available from other applications (for example, a memo function and a message transmission/reception function) to one application (for example, a video application), an application is configured independently. Unlike existing applications, a composite-function application is a single application configured to include several functions. Therefore, the composite-function application provides only limited functions, like the existing applications, and the user must additionally purchase such a new composite-function application.
Referring to Fig. 2b, the controller controls display of a first window 230, in which a launcher application, application L, runs, in the first window display region 201. As illustrated in Fig. 2b, the launcher application, application L, displays available application icons 231, 232, 233, 234, 235, 236, 237, and 238. When an application execution command is received through a touch on one of the application icons 231, 232, 233, 234, 235, 236, 237, and 238, the launcher application, application L, displays the application corresponding to the touched icon in one of the first to fourth window display regions 201, 202, 203, and 204.
Fig. 3 a, Fig. 3 b, Fig. 3 c, Fig. 3 d, Fig. 3 e, Fig. 3 f, Fig. 3 g, Fig. 3 h and Fig. 3 i illustrate the action stack managed in the display device according to disclosure embodiment.
With reference to Fig. 3 a, controller starts application in response to operate in generation and the management in action stack starting application.
With reference to Fig. 2 c and Fig. 2 d, user 1 can touch the icon 232 representing application B.When representing that the icon 232 of application B is touched, controller controls to show the Second Window 240 wherein having run application B in Second Window viewing area 202.Controller can display window in the first to the four window viewing area 201,202,203 and 204 in order.Such as, controller can control the display of new window with the clockwise order of Second Window viewing area 202, the 3rd window viewing area 203 and the 4th window viewing area 204.Clockwise window display order is an example of the display controlling new window, and therefore, controller can control the display of new window along counter-clockwise order.The order showing new window in window viewing area 201,202,203 and 204 can be changed.
Fig. 3 b illustrates the action stack corresponding to the window shown in figure 2d.Controller generates application B stack 302 in response to operating in action stack of application B.The application B stack 302 finally run is placed on and starts above application stack 301 by controller.This can imply, the Z order (it also can be described to order, grade or priority) of application B is higher than starting application: the Z order of application L.
Referring to Fig. 2e, the user 1 may touch the icon 233 corresponding to application C.
Fig. 3c illustrates the activity stack corresponding to the windows shown in Fig. 2e. Because the user 1 inputs an application execution command to the launcher application, application L, as illustrated in Fig. 2e, it is noted from Fig. 3c that the Z order of the launcher application, application L, is higher than the Z order of application B.
Referring to Fig. 2f, when the icon 233 representing application C is touched, the controller controls display of a third window 250, in which application C runs, in the fourth window display region 204.
Fig. 3d illustrates the activity stack corresponding to the windows illustrated in Fig. 2f. The controller generates an application C stack 303 in the activity stack in response to execution of application C. The controller places the most recently executed application C stack 303 above the launcher application stack 301. This may imply that the Z order of application C is higher than the Z order of the launcher application, application L.
Referring to Fig. 2g, the user 1 may touch the icon 234 representing application D.
Fig. 3e illustrates the activity stack corresponding to the windows illustrated in Fig. 2g. Because the user 1 inputs an application execution command to the launcher application, application L, as illustrated in Fig. 2g, it is noted from Fig. 3e that the Z order of the launcher application, application L, is higher than the Z order of application C.
Referring to Fig. 2h, when the icon 234 representing application D is touched, the controller controls display of a fourth window 260, in which application D runs, in the third window display region 203.
Fig. 3f illustrates the activity stack corresponding to the windows shown in Fig. 2h. The controller generates an application D stack 304 in the activity stack in response to execution of application D. The controller places the most recently executed application D stack 304 above the launcher application stack 301. This may imply that the Z order of application D is higher than the Z order of the launcher application, application L.
Referring to Fig. 2i, the user 1 may manipulate application B.
Fig. 3g illustrates the activity stack corresponding to the windows shown in Fig. 2i. The controller places the application B stack 302 on top of the activity stack in response to the user input to application B.
Referring to Fig. 2j, the user 1 may touch the icon 235 representing application E.
Fig. 3h illustrates the activity stack corresponding to Fig. 2j. Because the user 1 inputs an application execution command to the launcher application, application L, as shown in Fig. 2j, it is noted from Fig. 3h that the Z order of the launcher application, application L, is higher than the Z order of application D.
When the icon 235 representing application E is touched, referring to Fig. 2k, the controller controls display of a fifth window 270, in which application E runs, in the fourth window display region 204. When there is no empty window display region, the controller may refer to the activity stack shown in Fig. 3h. The controller may determine the application having the lowest Z order in the activity stack. For example, the controller may determine that the Z order of application C is the lowest in the activity stack of Fig. 3h. The controller then controls display of the fifth window 270 running application E in the fourth window display region 204, replacing the window of application C, which has the lowest Z order.
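The replacement rule just described (when no region is empty, reuse the region whose application has the lowest Z order) could be approximated as follows. This sketch reuses the ActivityStack class above, and its names are assumptions.

```java
import java.util.Map;

/** Illustrative selection of a target region when no window display region is empty. */
class RegionSelector {
    /**
     * @param regionToApp mapping of window display region id (e.g. 201..204) to the application shown there
     * @param stack       the activity stack used to look up Z orders
     * @return the region whose visible application has the lowest Z order; a new window replaces it
     */
    static int regionToReuse(Map<Integer, String> regionToApp, ActivityStack stack) {
        int targetRegion = -1;
        int lowestZ = Integer.MAX_VALUE;
        for (Map.Entry<Integer, String> entry : regionToApp.entrySet()) {
            int z = stack.zOrderOf(entry.getValue());
            if (z < lowestZ) {
                lowestZ = z;
                targetRegion = entry.getKey();
            }
        }
        return targetRegion;
    }
}
```

With the stack state of Fig. 3h, application C would be identified as lowest, so its region 204 is reused for the new window of application E.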
Fig. 3 i illustrates the action stack corresponding to the window shown in Fig. 2 k.Controller generates application E stack 305 in response to operating in action stack of application E.The application E stack 305 finally run is placed on and starts above application stack 301 by controller.This can imply that the Z order of application E is higher than startup application, the Z order of application L.
Fig. 4 a illustrates according to embodiments of the invention for controlling the process flow diagram of the method for display device.
With reference to Fig. 4 a, in operation S401, display device can run multiple application.Such as, display device can run application in response to the application action command being touched triggering by the user on the icon representing application.Display device, especially, the window manager of display device, can generate the window wherein running application.
Display device can determine the layout of arranging window.Layout definition's window can be arranged in window viewing area wherein.Such as, two kinds of patterns can be used for layout, that is, split mode and free style pattern.
In split mode, screen is divided in such a way: multiple window is not shown overlappingly.Such as, if display device display first window and Second Window, then display can divide the screen of such as touch-screen by the layout arranged, and the screen portions of division is defined as window viewing area.Display device can in each window viewing area display window.Because each window viewing area is screen section (segment), display device can show multiple window and not have overlap.
Display device can distribute multiple window to a window viewing area under split mode.Such as, display device can distribute first window and Second Window to first window viewing area.In this case, display device can compare the Z order (order in stack, grade, position) of first window and Second Window.If the Z order of first window is higher than the Z order of Second Window, then display device can show first window in first window viewing area.In this case, although Second Window manages as being arranged in first window viewing area by display device, display device does not show Second Window in first window viewing area.
On the other hand, under free style pattern, the display that multiple window can be superimposed according to their display priority level.Such as, if the viewing area of first window is overlapping with the viewing area of Second Window, then display device can compare the Z order of first window and Second Window.The Z order of window can refer to the display order of window.Such as, if the Z order of first window is higher than the Z order of Second Window, then display device can control to show first window in lap, instead of Second Window.
Under split mode, multiple layout can be used, and such as 2 up/down regions split layout, and 2 left/right regions split layout, and 3 regions split layout, and 4 regions split layout etc.In operation S405, under display device can determine that the layout of window is in split mode or free style pattern.If under layout is in split mode, then display device can determine that layout is that 2 up/down regions split layout, 2 left/right regions split layout, 3 regions fractionation layouts or 4 regions fractionation layouts further.
Once determine the pattern of layout in operation S405, display device can determine the window's position in layout in operation S407.In 2 up/down Regional Distribution situations, display device can be determined at upper window viewing area arrangement first window and the 3rd window and arrange Second Window in lower window viewing area.Alternatively, under free style pattern, display device can determine the coordinates regional of first window and the coordinates regional for Second Window.
Display device can be determined the Z order of multiple application and can show multiple window by the Z order based on application in operation S411 in operation S409.Such as, in 2 up/down region split mode situations, display device can compare the Z order of first window and the 3rd window.In addition, display device can control to show the window with relatively high Z order in corresponding window viewing area.Under free style pattern, display device can compare the Z order of first window and Second Window and can control to have the window of the relatively high Z order of display in overlapping region.
Fig. 4 b illustrates according to embodiments of the invention for controlling the process flow diagram of the method for display device.
With reference to Fig. 4 b, in operation S401, display device can run multiple application.Such as, can by representing that the towing gesture of the point be displayed on is triggered application action command by the window that the icon of application is drawn to for applying.The input of towing gesture is an example of application action command, and therefore, application can be run in many ways.Those skilled in the art will easily understand, and the disclosure is not limited to application-specific operation method.
In operation S421, under display device can determine whether current arrangements is in free style pattern.In free style mode arrangements situation, in operation S423, display device can determine the Z order of multiple each window be applied in the window wherein run.In operation S425, display device can according to the Z order display window of window.
Under split mode layout scenarios in operation S421, in operation S431, display device can arrange window in window viewing area.In addition, in operation S433, display device can determine the Z order of the window in each window viewing area.Such as, display device as shown in table 1ly can determine the Z order of window.
[Table 1]
Window | Window display region (page) | Z order
A      | 1                            | 1
B      | 2                            | 5
C      | 3                            | 6
D      | 2                            | 2
E      | 1                            | 3
F      | 4                            | 4
As described above, the display device may control display of window A, which has the relatively higher Z order, in the first window display region, instead of window E. The display device may control display of window D, which has the relatively higher Z order, in the second window display region, instead of window B. In addition, the display device may display window C in the third window display region and window F in the fourth window display region. That is, in operation S435, the display device may display, in each window display region, the window having the highest Z order among the windows assigned to that window display region.
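A rough sketch of the selection in operation S435, using Table 1's convention in which rank 1 is the topmost Z order; the types and names are assumptions, not the device's implementation.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative selection of the visible window per region, using Table 1's convention (rank 1 = topmost). */
class SplitModeCompositor {
    record WindowInfo(String name, int region, int zRank) {}

    /** Returns, for each region, the window with the best (smallest) Z-order rank. */
    static Map<Integer, String> visibleWindows(WindowInfo[] windows) {
        Map<Integer, WindowInfo> best = new HashMap<>();
        for (WindowInfo w : windows) {
            WindowInfo current = best.get(w.region());
            if (current == null || w.zRank() < current.zRank()) {
                best.put(w.region(), w);
            }
        }
        Map<Integer, String> result = new HashMap<>();
        best.forEach((region, w) -> result.put(region, w.name()));
        return result;
    }
}
```

Applied to the data of Table 1, this yields window A for region 1, window D for region 2, window C for region 3, and window F for region 4, matching the description above.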
Fig. 5 illustrates display orders (Z orders) of windows according to an embodiment of the present disclosure.
Referring to Fig. 5, the Z order of a screen may be divided into N layers, with the Nth layer placed above the (N-1)th layer. A window may exist in each layer, and an application may run in the window. For example, when a first application is executed, the first application runs in the window of a first layer. When a second application is executed, the second application runs in the window of a second layer, and when a third application is executed, the third application runs in the window of a third layer. Thus the first, second, and third layers are stacked in sequence. The most recently created layer is at the top of the layer stack and is therefore displayed on the topmost layer of the screen. For example, a plurality of windows (a) to (d) may be displayed overlapped on a home screen. For example, the first window (a) is displayed overlapping the second window (b), the third window (c), and the fourth window (d); the second window (b) is displayed overlapping the third window (c) and the fourth window (d); and the third window (c) is displayed overlapping the fourth window (d). When the plurality of windows (a) to (d) are displayed overlapped in this way, the order in which the windows (a) to (d) are displayed is the Z order of the windows (a) to (d). The Z order may be the order in which windows are displayed along the Z axis. The layer view (e) may be a screen that displays the Z order of the windows layer by layer. The Z order may also be called a display order.
Fig. 6 a, Fig. 6 b, Fig. 6 c and Fig. 6 d illustrate the application operation method according to disclosure embodiment.More specifically, Fig. 6 a, Fig. 6 b, Fig. 6 c and Fig. 6 d illustrate the method for running application under free style mode arrangements.
With reference to Fig. 6 a, Fig. 6 b, Fig. 6 c and Fig. 6 d, display device 600 display window viewing area 620.The pallet 610 holding useful application icon 611,612,613,614,615,616 and 617 is shown to the left side of window viewing area 620 by display device 600.User 10 can operate display device 600 to run the first application A1.Such as, in figure 6b, user 10 can make the icon 611 expression first being applied A1 and be drawn to the towing gesture 625 of first in window viewing area 620.Display device 600 can in response to first the place display first window 630 of towing gesture 625 in window viewing area 620 to run the first application A1 in first window 630.First window 630 can before terminating with default size and shape or the size and shape display that arranges with user 10.
User 10 can operate display device 600 additionally to run the 3rd application A3.Such as, as shown in the figure 6c, user 10 can make the towing gesture 635 that the icon 613 expression the 3rd being applied A3 is drawn to the second point in window viewing area 620.Display device 600 in response to second point place display three window 640 of the action command of input (that is, pulling gesture 635) in window viewing area 620, can apply A3 to run the 3rd in the 3rd window 640.3rd window 640 can before terminating with default size and shape or the size and shape display that arranges with user 10.Because the 3rd window 640 is user 10 has been applied with gesture input last windows to it, so controller (not shown) can apply the higher priority of task grade of A1 to the 3rd application A3 distribution ratio first.Therefore, controller can control the 3rd application A3 be presented on the first application A1.
Fig. 7 is a flowchart of a method of controlling a display device according to an embodiment of the present invention.
Referring to Fig. 7, in operation S701, the display device may display at least one icon representing an application. For example, the display device may display a tray holding the at least one icon in a part of the touch screen.
In operation S703, when the user drags an icon to a first point arranged in the window layout, the display device may receive the input of the drag gesture. The display device may recognize the drag gesture from the icon to the first point as a command to run the application corresponding to the icon. More specifically, in operation S705, the display device may determine the position in the layout of the first point at which the drag gesture ended. For example, if the layout is set to a split mode, the display device may determine the window area of the layout to which the first point corresponds.

In operation S707, the display device may determine at least one of the size and the position of a window according to the position of the first point in the layout. In operation S709, the display device may display the window with the determined size and/or position.
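The flow of operations S701 to S709 can be summarized in a hedged sketch. The fragment below is not the patented implementation; the Rect and SplitLayout types, and the quartering rule inside windowAreaAt, are assumptions chosen to illustrate how a drag end point may determine the size and position of a new window.

// A hedged sketch of the flow in Fig. 7 (operations S701-S709); not the actual implementation.
class Rect {
    final int left, top, right, bottom;
    Rect(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
    public String toString() { return "(" + left + "," + top + ")-(" + right + "," + bottom + ")"; }
}

class SplitLayout {
    private final int width, height;
    SplitLayout(int width, int height) { this.width = width; this.height = height; }

    // S705/S707: the end point of the drag selects a window display area of the layout.
    Rect windowAreaAt(int x, int y) {
        int midX = width / 2, midY = height / 2;
        int left = (x < midX) ? 0 : midX;
        int top = (y < midY) ? 0 : midY;
        return new Rect(left, top, left + midX, top + midY);   // one quarter in a 4-region split mode
    }
}

class DragToRunController {
    private final SplitLayout layout;
    DragToRunController(SplitLayout layout) { this.layout = layout; }

    // S703: a drag of an icon is recognized as a command to run the corresponding application.
    void onIconDragEnd(String app, int endX, int endY) {
        Rect area = layout.windowAreaAt(endX, endY);           // S705/S707
        System.out.println("S709: display window for " + app + " at " + area);
    }
}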
Fig. 8a, Fig. 8b, Fig. 8c, Fig. 8d, Fig. 8e, Fig. 8f, Fig. 8g, Fig. 8h, Fig. 8i, Fig. 8j, Fig. 8k, Fig. 8l and Fig. 8m illustrate a method of displaying a plurality of windows according to an embodiment of the present disclosure.
Referring to Fig. 8a to Fig. 8m, the display device 800 displays a menu screen 817. The menu screen 817 may be an execution screen of a launcher program and may include icons representing applications. The menu screen 817 may also include information about the current time and may further include widgets. The display device 800 displays a tray 810 holding available icons 811, 812, 813, 814, 815 and 816 on the left side of the touch screen.

As shown in Fig. 8b, the user 10 may manipulate the display device 800 to run a first application A. For example, as shown in Fig. 8c, the user 10 may touch the icon 811 representing the first application A and drag the touched icon 811 onto the menu screen 817. The controller (not shown) may control the icon 811 to be displayed at the dragged point. The controller may further control a ghost view 818 to be displayed at the dragged point. The ghost view 818 is a preview of the size and shape of the window in which the first application A will run, allowing the user 10 to select the window position. Because no window is displayed yet, the controller may display the ghost view 818 in full screen. As described below, the controller may control the ghost view to be displayed in full screen when no window is displayed on the touch screen. If a single window is already displayed on the touch screen, the controller may display the ghost view with a size and shape corresponding to half of the touch screen. If two windows are already displayed on the touch screen, the controller may display the ghost view with a size and shape corresponding to half of one of the two windows. If three windows are already displayed on the touch screen, the controller may display the ghost view with a size and shape corresponding to half of the largest of the three windows.
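As a rough illustration of the ghost-view sizing rule just described, the following sketch applies the rule for zero, one, two or three visible windows. It reuses the Rect type from the earlier sketch and always picks the lower or right half for simplicity, whereas the embodiment chooses the half according to the dragged-to position; all names are assumptions.

// A sketch, under stated assumptions, of the ghost-view sizing rule described above.
import java.util.Comparator;
import java.util.List;

class GhostViewSizer {
    static Rect ghostViewFor(List<Rect> visibleWindows, Rect screen, Rect dropTarget) {
        if (visibleWindows.isEmpty()) {
            return screen;                                   // no window yet: full-screen ghost view
        }
        if (visibleWindows.size() == 1) {
            return lowerHalf(screen);                        // one window: half of the touch screen
        }
        if (visibleWindows.size() == 2) {
            return rightHalf(dropTarget);                    // two windows: half of the dragged-over window
        }
        Rect largest = visibleWindows.stream()               // three windows: half of the largest window
                .max(Comparator.comparingInt(GhostViewSizer::area)).get();
        return rightHalf(largest);
    }

    private static int area(Rect r) { return (r.right - r.left) * (r.bottom - r.top); }
    private static Rect lowerHalf(Rect r) { return new Rect(r.left, (r.top + r.bottom) / 2, r.right, r.bottom); }
    private static Rect rightHalf(Rect r) { return new Rect((r.left + r.right) / 2, r.top, r.right, r.bottom); }
}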
The controller may recognize the above drag gesture as a command to run a new application. The controller may generate a first window 819 for running the first application A. As shown in Fig. 8d, the controller may control the first window 819 to be displayed in full screen.
The user 10 may manipulate the display device 800 to additionally run a second application B. For example, as shown in Fig. 8e, the user may touch the icon 812 representing the second application B and, as shown in Fig. 8f, drag the touched icon 812 to the lower half of the first window 819. The controller may control the icon 812 to be displayed at the dragged point. In addition, the controller may control a ghost view 823 to be displayed at the dragged point. As described above, because a single window 819 is displayed on the touch screen, the controller may control the ghost view 823 to be displayed with a size and shape corresponding to half of the touch screen. Although not shown, if the user 10 drags the touched icon 812 to the upper half of the touch screen, the controller controls the ghost view 823 to be displayed in the upper half of the touch screen. Displaying the ghost view in the lower half of the touch screen is only one example; the controller may instead divide the touch screen into left and right halves and control the ghost view to be displayed in the left half or the right half of the touch screen.

If the user ends the drag in the lower half of the touch screen as shown in Fig. 8f, the controller determines that a new application execution command has been received. As shown in Fig. 8g, the controller controls a second window 830 to be displayed in the lower half of the touch screen, consistently with the ghost view 823 shown in Fig. 8f. In addition, the controller reduces the size and shape of the first window 819 to a first window 820 so that the first window 820 is displayed in the upper half of the touch screen. The controller generates and displays a center button 825 at the boundary between the first window 820 and the second window 830.
The user 10 may manipulate the display device 800 to additionally run a third application C. For example, as shown in Fig. 8h, the user may touch the icon 813 representing the third application C and, as shown in Fig. 8i, drag the touched icon 813 to the right part of the first window 820. The controller may control the icon 813 to be displayed at the dragged point. In addition, the controller may control a ghost view 827 to be displayed at the dragged point. As described above, because two windows 820 and 830 are displayed on the touch screen, the controller may control the ghost view 827 to be displayed with a size and shape corresponding to half of the first window 820. Although not shown, if the user 10 drags the touched icon 813 to the left part of the first window 820, the controller controls the ghost view 827 to be displayed in the left half of the first window 820. Displaying the ghost view 827 in the right half of the first window 820 is only one example; the controller may instead divide the first window 820 into upper and lower halves and control the ghost view 827 to be displayed in the upper half or the lower half of the first window 820. Displaying the ghost view 827 in one half of the first window 820 is likewise only one example. The controller may determine the size and shape of the ghost view 827 relative to the center button 825 and display the ghost view 827 accordingly.

If the user ends the drag in the right part of the first window 820 as shown in Fig. 8i, the controller determines that a new application execution command has been received. As shown in Fig. 8j, the controller controls a third window 840 to be displayed in the right half of the first window 820, consistently with the ghost view 827 shown in Fig. 8i. Alternatively, the controller may control the third window 840 to be displayed consistently with the position of the center button 825. Accordingly, as more applications are selected for execution, parts of the screen may be progressively subdivided so that a respective part of the screen is allocated to each running application.

In addition, the controller reduces the size and shape of the first window 820 consistently with the creation of the third window 840. For example, the controller may control the first window 820 to be displayed in the region other than the display region of the third window 840.
The user 10 may manipulate the display device 800 to additionally run a fourth application D. For example, as shown in Fig. 8k, the user may touch the icon 814 representing the fourth application D and, as shown in Fig. 8l, drag the touched icon 814 to the right part of the second window 830. The controller may control the icon 814 to be displayed at the dragged point. In addition, the controller may control a ghost view 831 to be displayed at the dragged point. As described above, because three windows 820, 830 and 840 are displayed on the touch screen, the controller may control the ghost view 831 to be displayed with a size and shape corresponding to half of the second window 830. Although not shown, if the user 10 drags the touched icon 814 to the left part of the second window 830, the controller controls the ghost view 831 to be displayed in the left half of the second window 830. Displaying the ghost view 831 in the right half of the second window 830 is only one example; the controller may instead divide the second window 830 into upper and lower halves and control the ghost view 831 to be displayed in the upper half or the lower half of the second window 830. Displaying the ghost view 831 in one half of the second window 830 is likewise only one example. The controller may determine the size and shape of the ghost view 831 relative to the center button 825 and display the ghost view 831 accordingly.
If the user ends the drag in the right part of the second window 830 as shown in Fig. 8l, the controller determines that a new application execution command has been received. As shown in Fig. 8m, the controller controls a fourth window 850 to be displayed in the right half of the second window 830, consistently with the ghost view 831 shown in Fig. 8l. Alternatively, the controller may control the fourth window 850 to be displayed consistently with the position of the center button 825.

In addition, the controller reduces the size and shape of the second window 830 consistently with the creation of the fourth window 850.
As described above, the display device may control a window to be displayed in the window display area in which the drag gesture ends. In Fig. 8a to Fig. 8m, windows are displayed at different positions with the same size. Various embodiments in which windows are configured with different sizes at different positions are described below with reference to Fig. 9a to Fig. 9h, Figure 10a to Figure 10d, Figure 11a to Figure 11c, Figure 12a to Figure 12c, Figure 13a to Figure 13c, Figure 14a to Figure 14c, Figure 15a to Figure 15c, Figure 16a to Figure 16d and Figure 17.
Fig. 9a, Fig. 9b, Fig. 9c, Fig. 9d, Fig. 9e, Fig. 9f, Fig. 9g and Fig. 9h illustrate layouts according to an embodiment of the present disclosure.
Fig. 9a illustrates a full-screen layout for the case in which no split mode is set. In Fig. 9a, the display device defines a first window display area 901 over the entire screen.
Fig. 9b illustrates an input field 902 corresponding to the first window display area 901.
Fig. 9c illustrates a screen layout in a 2-region top/bottom split mode. In Fig. 9c, the display device may divide the screen into an upper region and a lower region and define a first window display area 911 and a second window display area 912 in the upper region and the lower region, respectively.

Fig. 9d illustrates input fields in the 2-region top/bottom split mode. A first input field 913 may correspond to the first window display area 911 and a third input field 915 may correspond to the second window display area 912. A second input field 914 may correspond to the boundary between the first window display area 911 and the second window display area 912. For example, when the user makes a drag gesture dragging an icon to the first input field 913, the display device may display a window in the first window display area 911 shown in Fig. 9c. When the user drags an icon to the third input field 915, the display device may display a window in the second window display area 912 shown in Fig. 9c. When the user drags an icon to the second input field 914, the display device may display a window across the whole of the first window display area 911 and the second window display area 912 shown in Fig. 9c.

Fig. 9e illustrates a screen layout in a 2-region left/right split mode. In Fig. 9e, the display device may divide the screen into a left region and a right region and define a first window display area 921 and a second window display area 922 in the left region and the right region, respectively.

Fig. 9f illustrates input fields in the 2-region left/right split mode. A first input field 923 may correspond to the first window display area 921 and a third input field 925 may correspond to the second window display area 922. A second input field 924 may correspond to the boundary between the first window display area 921 and the second window display area 922. For example, when the user makes a drag gesture dragging an icon to the first input field 923, the display device may display a window in the first window display area 921 shown in Fig. 9e. When the user drags an icon to the third input field 925, the display device may display a window in the second window display area 922 shown in Fig. 9e. When the user drags an icon to the second input field 924, the display device may display a window across the whole of the first window display area 921 and the second window display area 922 shown in Fig. 9e.
Fig. 9g illustrates a layout in a 4-region split mode according to an embodiment of the present disclosure, and Fig. 9h illustrates the input fields defined for the 4-region split layout shown in Fig. 9g.
Referring to Fig. 9g and Fig. 9h, the display device defines first to fourth window display areas 931, 932, 933 and 934. The user may therefore manipulate the display device to arrange a window in any one of the first to fourth window display areas 931, 932, 933 and 934. For example, when the user drags an icon representing an application to the second input field 942, the display device may arrange and display a window in the second window display area 932. If the user completes the drag gesture at the boundary between the first window display area 931 and the second window display area 932, the display device may display a window across the whole of the first window display area 931 and the second window display area 932. For example, the display device may define a first input field 941 corresponding to the first window display area 931 and a second input field 942 corresponding to the second window display area 932. The display device may further define a fifth input field 945 at the boundary between the first window display area 931 and the second window display area 932. Similarly, the display device may define a third input field 943 and a fourth input field 944 corresponding to the third window display area 933 and the fourth window display area 934, respectively. The display device may further define a sixth input field 946 at the boundary between the first window display area 931 and the third window display area 933, a seventh input field 947 at the boundary between the second window display area 932 and the fourth window display area 934, and an eighth input field 948 at the boundary between the third window display area 933 and the fourth window display area 934. The display device may further define a ninth input field 949 at the crossing point where the first to fourth window display areas 931, 932, 933 and 934 meet. When the drag gesture ends in a particular input field, the display device determines the window display area(s) in which to display the window based on the mapping shown in Table 2.
[Table 2]

Input field in which the drag gesture ends | Window display area(s) in which the window is displayed
First input field 941 | First window display area 931
Second input field 942 | Second window display area 932
Third input field 943 | Third window display area 933
Fourth input field 944 | Fourth window display area 934
Fifth input field 945 | First and second window display areas 931 and 932
Sixth input field 946 | First and third window display areas 931 and 933
Seventh input field 947 | Second and fourth window display areas 932 and 934
Eighth input field 948 | Third and fourth window display areas 933 and 934
Ninth input field 949 | First to fourth window display areas 931 to 934 (full screen)
As described above, the display device may define input fields for determining the window display area in which a drag gesture ends. More specifically, the display device may define input fields corresponding to the boundaries between window display areas and an input field corresponding to the crossing point where the window display areas meet. When a drag gesture ends in an input field corresponding to a boundary between window display areas, the display device may display the window across the whole of those window display areas. When a drag gesture ends in the input field corresponding to the crossing point where the window display areas meet, the display device may display the window across the whole of those window display areas. In this way, the display device may display windows with different sizes at different positions. This configuration is described in more detail with reference to Figure 10a to Figure 10d, Figure 11a to Figure 11c, Figure 12a to Figure 12c, Figure 13a to Figure 13c, Figure 14a to Figure 14c, Figure 15a to Figure 15c, Figure 16a to Figure 16d and Figure 17. More specifically, these figures illustrate layouts in the 4-region split mode, so Fig. 9g and Fig. 9h are also referred to in the following description.
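The mapping of Table 2 for the 4-region split layout can also be expressed programmatically. The sketch below is only an illustration under the assumption that the input fields are numbered 1 to 9 as in Fig. 9h; the enum and class names are not taken from the disclosure.

// A hedged sketch of the Table 2 mapping: a drag that ends in an input field inside a window
// display area selects that area, a drag that ends on a border selects both adjacent areas,
// and a drag that ends at the crossing point selects all four areas (full screen).
import java.util.EnumSet;

enum Area { AREA_931, AREA_932, AREA_933, AREA_934 }

class InputFieldMap {
    static EnumSet<Area> areasFor(int inputField) {
        switch (inputField) {
            case 1: return EnumSet.of(Area.AREA_931);                  // first input field 941
            case 2: return EnumSet.of(Area.AREA_932);                  // second input field 942
            case 3: return EnumSet.of(Area.AREA_933);                  // third input field 943
            case 4: return EnumSet.of(Area.AREA_934);                  // fourth input field 944
            case 5: return EnumSet.of(Area.AREA_931, Area.AREA_932);   // border between 931 and 932
            case 6: return EnumSet.of(Area.AREA_931, Area.AREA_933);   // border between 931 and 933
            case 7: return EnumSet.of(Area.AREA_932, Area.AREA_934);   // border between 932 and 934
            case 8: return EnumSet.of(Area.AREA_933, Area.AREA_934);   // border between 933 and 934
            case 9: return EnumSet.allOf(Area.class);                  // crossing point 949: full screen
            default: throw new IllegalArgumentException("unknown input field " + inputField);
        }
    }
}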
Figure 10a, Figure 10b, Figure 10c and Figure 10d illustrate screens of a display device according to an embodiment of the present disclosure.
Referring to Figure 10a to Figure 10d, the controller controls display of a window display area 1000 and of a tray 1010 holding available icons 1011, 1012, 1013, 1014, 1015, 1016 and 1017 representing applications. The controller may display the tray 1010 at all times, or may display the tray 1010 only when a tray call command is received. The tray call command may be generated, for example, in response to a flick from the left edge of the touch screen. Those skilled in the art will readily understand that the disclosure is not limited to a particular input type for triggering the tray call command. It is assumed that the display device is displaying a first window running application A in the window display area 1000.

Referring to Figure 10b, the user 10 may make a drag gesture 1021 dragging the icon 1016 representing application F to a first point 1027 in the lower half of the window display area 1000. In Figure 10c, the controller may determine the window display area. In the 4-region split mode, the controller may determine the input field in which the drag gesture 1021 ends. For example, if the first point 1027 is located in the eighth input field 948 of Fig. 9h, the controller may determine, as shown in Table 2, that an F window 1024 is to be displayed across the whole of the third window display area 933 and the fourth window display area 934. The controller may then display a ghost view 1023 in the determined area.

By viewing the ghost view 1023, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1021, and the F window 1024 may be displayed across the whole of the third window display area 933 and the fourth window display area 934, as shown in Figure 10d. Because the F window 1024 is displayed, the controller may reduce the size of the A window 1000 to half and display the shrunken A window 1000. The controller may scale down the A window 1000 with the same aspect ratio or with a new aspect ratio.
Figure 11a, Figure 11b and Figure 11c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, Figure 11a, Figure 11b and Figure 11c illustrate operations subsequent to those of Figure 10a to Figure 10d.
Referring to Figure 11a, the display device displays the A window 1000 and the F window 1024 in split mode in the upper half and the lower half of the screen, respectively. The user 10 may manipulate the display device to additionally run application E. The user 10 may make a drag gesture 1032 dragging the icon 1015 representing application E to a second point 1033.

Referring to Figure 11b and Figure 11c, the controller may determine the input field corresponding to the second point 1033. If the controller determines that the second point 1033 corresponds to the eighth input field 948 of Fig. 9h, the controller may determine that an E window 1034 is to be displayed across the whole of the third window display area 933 and the fourth window display area 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1031.

By viewing the ghost view 1031, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1032. The E window 1034 may be displayed across the whole of the third window display area 933 and the fourth window display area 934.
Figure 12a, Figure 12b and Figure 12c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, Figure 12a, Figure 12b and Figure 12c illustrate operations subsequent to those of Figure 11a to Figure 11c.

Referring to Figure 12a, the display device displays the A window 1000 and the E window 1034 in split mode in the upper half and the lower half of the screen, respectively. The user 10 may manipulate the display device to additionally run application G. The user 10 may make a drag gesture 1041 dragging the icon 1017 representing application G to a third point 1042.

Referring to Figure 12b and Figure 12c, the controller may determine the input field corresponding to the third point 1042. If the controller determines that the third point 1042 corresponds to the ninth input field 949 of Fig. 9h, the controller may determine that a G window 1044 is to be displayed across the whole of the first to fourth window display areas 931 to 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1043.

By viewing the ghost view 1043, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1041. The G window 1044 may be displayed in full screen, as shown in Figure 12c.
Figure 13a, Figure 13b and Figure 13c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, Figure 13a, Figure 13b and Figure 13c illustrate operations subsequent to those of Figure 12a to Figure 12c.

Referring to Figure 13a, Figure 13b and Figure 13c, the display device is displaying the G window 1044. The user 10 may make a drag gesture 1051 dragging the icon 1012 representing application B to a fourth point 1052 in the lower half of the G window 1044 in Figure 13b. When the controller determines that the fourth point 1052 corresponds to the eighth input field 948 of Fig. 9h, the controller may determine that a B window 1054 is to be displayed across the whole of the third window display area 933 and the fourth window display area 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1053.

By viewing the ghost view 1053, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1051. The B window 1054 may be displayed across the whole of the third window display area 933 and the fourth window display area 934, as shown in Figure 13c. Because the B window 1054 is displayed, the controller may shrink the G window 1044 to half of the screen and display the shrunken G window 1044 in the upper half of the screen.
Figure 14a, Figure 14b and Figure 14c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, Figure 14a, Figure 14b and Figure 14c illustrate operations subsequent to those of Figure 13a to Figure 13c.

Referring to Figure 14a, the display device displays the G window 1044 and the B window 1054 in split mode in the upper half and the lower half of the screen, respectively. The user 10 may manipulate the display device to additionally run application C. The user 10 may make a drag gesture 1061 dragging the icon 1013 representing application C to a fifth point 1062.

Referring to Figure 14b and Figure 14c, the controller may determine the input field corresponding to the fifth point 1062. If the controller determines that the fifth point 1062 corresponds to the second input field 942 of Fig. 9h, the controller may determine that a C window 1064 is to be displayed in the second window display area 932, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1063.

By viewing the ghost view 1063, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1061. The C window 1064 may be displayed in the second window display area 932, as shown in Figure 14c.
Figure 15a, Figure 15b and Figure 15c illustrate screens of a display device according to an embodiment of the present disclosure. More specifically, Figure 15a, Figure 15b and Figure 15c illustrate operations subsequent to those of Figure 14a to Figure 14c.

Referring to Figure 15a, the display device displays the G window 1044, the B window 1054 and the C window 1064 in a 3-region split mode. The user 10 may manipulate the display device to additionally run application D. The user 10 may make a drag gesture 1071 dragging the icon 1014 representing application D to a sixth point 1072.

Referring to Figure 15b and Figure 15c, the controller may determine the input field corresponding to the sixth point 1072. If the controller determines that the sixth point 1072 corresponds to the fourth input field 944 of Fig. 9h, the controller may determine that a D window 1074 is to be displayed in the fourth window display area 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1073.

By viewing the ghost view 1073, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture 1071. The D window 1074 may be displayed in the fourth window display area 934, as shown in Figure 15c.
Figure 16a, Figure 16b, Figure 16c and Figure 16d illustrate screens of a display device according to an embodiment of the present disclosure.

Figure 16a illustrates a screen of the display device according to an embodiment of the present disclosure. More specifically, Figure 16a illustrates operations subsequent to those of Figure 15a to Figure 15c.

Referring to Figure 16a, the display device displays the G window 1044, the B window 1054, the C window 1064 and the D window 1074 in a 4-region split mode. The user 10 may manipulate the display device to additionally run application H. The user 10 may make a drag gesture dragging the icon 1018 representing application H to a seventh point 1081.

Referring to Figure 16a, the controller may determine the input field corresponding to the seventh point 1081. If the controller determines that the seventh point 1081 corresponds to the fifth input field 945 of Fig. 9h, the controller may determine that an H window 1083 is to be displayed in the first window display area 931 and the second window display area 932, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1082. Icons 1015, 1016, 1017, 1018, 1019, 1020 and 1021 representing applications E to K may be arranged in the tray 1010. For example, the user 10 may input an upward drag gesture across the tray 1010 so that the hidden icons 1018, 1019, 1020 and 1021 representing applications H to K are exposed in the tray 1010.

By viewing the ghost view 1082, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture. The H window 1083 may be displayed in the first window display area 931 and the second window display area 932, as shown in Figure 16a.
Figure 16b illustrates a screen of the display device according to an embodiment of the present disclosure. More specifically, Figure 16b illustrates operations subsequent to those of Figure 15a to Figure 15c.

Referring to Figure 16b, the display device displays the G window 1044, the B window 1054, the C window 1064 and the D window 1074 in a 4-region split mode. The user 10 may manipulate the display device to additionally run application H. The user 10 may make a drag gesture dragging the icon 1018 representing application H to an eighth point 1084.

Referring to Figure 16b, the controller may determine the input field corresponding to the eighth point 1084. If the controller determines that the eighth point 1084 corresponds to the sixth input field 946 of Fig. 9h, the controller may determine that an H window 1086 is to be displayed in the first window display area 931 and the third window display area 933, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1085.

By viewing the ghost view 1085, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture. The H window 1086 may be displayed in the first window display area 931 and the third window display area 933, as shown in Figure 16b.
Figure 16c illustrates a screen of the display device according to an embodiment of the present disclosure. More specifically, Figure 16c illustrates operations subsequent to those of Figure 15a to Figure 15c.

Referring to Figure 16c, the display device displays the G window 1044, the B window 1054, the C window 1064 and the D window 1074 in a 4-region split mode. The user 10 may manipulate the display device to additionally run application H. The user 10 may make a drag gesture dragging the icon 1018 representing application H to a ninth point 1087.

Referring to Figure 16c, the controller may determine the input field corresponding to the ninth point 1087. If the controller determines that the ninth point 1087 corresponds to the eighth input field 948 of Fig. 9h, the controller may determine that an H window 1089 is to be displayed in the third window display area 933 and the fourth window display area 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1088.

By viewing the ghost view 1088, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture. The H window 1089 may be displayed in the third window display area 933 and the fourth window display area 934, as shown in Figure 16c.
Figure 16d illustrates a screen of the display device according to an embodiment of the present disclosure. More specifically, Figure 16d illustrates operations subsequent to those of Figure 15a to Figure 15c.

Referring to Figure 16d, the display device displays the G window 1044, the B window 1054, the C window 1064 and the D window 1074 in a 4-region split mode. The user 10 may manipulate the display device to additionally run application H. The user 10 may make a drag gesture dragging the icon 1018 representing application H to a tenth point 1090.

Referring to Figure 16d, the controller may determine the input field corresponding to the tenth point 1090. If the controller determines that the tenth point 1090 corresponds to the seventh input field 947 of Fig. 9h, the controller may determine that an H window 1092 is to be displayed in the second window display area 932 and the fourth window display area 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1091.

By viewing the ghost view 1091, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture. The H window 1092 may be displayed in the second window display area 932 and the fourth window display area 934, as shown in Figure 16d.
Figure 17 illustrates a screen of the display device according to an embodiment of the present disclosure. More specifically, Figure 17 illustrates operations subsequent to those of Figure 15a to Figure 15c.

Referring to Figure 17, the display device displays the G window 1044, the B window 1054, the C window 1064 and the D window 1074 in a 4-region split mode. The user 10 may manipulate the display device to additionally run application H. The user 10 may make a drag gesture dragging the icon 1018 representing application H to an eleventh point 1093.

Referring to Figure 17, the controller may determine the input field corresponding to the eleventh point 1093. If the controller determines that the eleventh point 1093 corresponds to the ninth input field 949 of Fig. 9h, the controller may determine that an H window 1095 is to be displayed across the whole of the first to fourth window display areas 931 to 934, as shown in Table 2. The controller may therefore display the determined window display area as a ghost view 1094.

By viewing the ghost view 1094, the user 10 can check whether the window will be displayed at the desired position. The user 10 may release the drag gesture. The H window 1095 may be displayed in full screen.
As described above, the display device may provide windows with different sizes at different positions according to the end point of the drag gesture. Although the 4-region split mode has been described above, the description can be extended to a 9-region split mode and the like.
Figure 18a and Figure 18b illustrate a 9-region split mode according to an embodiment of the present disclosure.

Referring to Figure 18a and Figure 18b, the display device may define nine split window display areas. In addition, the display device may define input fields A, C, E, K, M, O, U, W and Y corresponding to the respective window display areas, input fields B, D, F, H, J, L, N, P, R, T, V and X corresponding to the boundaries between window display areas, and input fields G, I, Q and S corresponding to the crossing points where window display areas meet. When the end point of a drag gesture is in an input field corresponding to a boundary between display areas, the display device may display the window across the whole of those window display areas. When the end point of a drag gesture is located in an input field corresponding to a crossing point where window display areas meet, the display device may display the window across the whole of those window display areas. In this way, the display device may display windows with different sizes at different positions according to the end point of the drag gesture.
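One possible way to classify a drag end point into the three kinds of input fields of the 9-region split mode is sketched below. The thirds-based grid and the margin parameter are assumptions for illustration; the embodiment may define the input fields differently.

// An illustrative sketch (not from the patent text): a point near no border falls inside a
// single display area, a point near exactly one border spans the two adjacent areas, and a
// point near two borders lies at a crossing point and spans four areas.
class NineRegionClassifier {
    private final int width, height, margin;

    NineRegionClassifier(int width, int height, int margin) {
        this.width = width; this.height = height; this.margin = margin;
    }

    // Returns the number of display areas the new window will span: 1, 2 or 4.
    int spannedAreas(int x, int y) {
        boolean nearVertical = nearBorder(x, width / 3) || nearBorder(x, 2 * width / 3);
        boolean nearHorizontal = nearBorder(y, height / 3) || nearBorder(y, 2 * height / 3);
        if (nearVertical && nearHorizontal) return 4;   // crossing-point fields G, I, Q, S
        if (nearVertical || nearHorizontal) return 2;   // border fields B, D, F, H, J, L, N, P, R, T, V, X
        return 1;                                       // area fields A, C, E, K, M, O, U, W, Y
    }

    private boolean nearBorder(int coordinate, int border) {
        return Math.abs(coordinate - border) <= margin;
    }
}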
Figure 19 is a flowchart of a method of controlling a display device according to an embodiment of the present invention.

Referring to Figure 19, in operation S1901, the display device may display at least one icon representing an application. In operation S1903, the display device may receive a drag gesture dragging an icon to a first point. In operation S1905, the display device may determine the position in the layout at which the drag gesture ends.

In operation S1907, the display device may determine whether the drag gesture ends at a boundary between window display areas. If the drag gesture ends at a boundary between window display areas, the display device may display the window across the whole of those window display areas in operation S1909.

In operation S1911, the display device may determine whether the drag gesture ends at a crossing point where window display areas meet. If the drag gesture ends at a crossing point where window display areas meet, the display device may display the window across the whole of those window display areas in operation S1913.

In operation S1915, the display device may determine whether the drag gesture ends inside a window display area. If the drag gesture ends inside a window display area, the display device may display the window in that window display area in operation S1917.
Figure 20 is a flowchart of a method of controlling a display device according to an embodiment of the present invention.

Referring to Figure 20, in operation S2001, the controller may set a layout in a split mode. In operation S2003, the controller may define a plurality of window display areas according to the split mode. In operation S2005, the controller may allocate a plurality of windows to the window display areas. More specifically, the controller may allocate a plurality of windows to a single window display area.

In operation S2007, the controller may control each window display area to display the window having the highest Z order among the windows allocated to that window display area. For example, if a plurality of windows are allocated to the first window display area, the controller may control the window having the highest Z order among the allocated windows to be displayed.

In operation S2009, the controller may determine whether a Z order change command is received through the touch screen. The Z order change command is a command requesting a change of the Z order of a window. For example, the Z order change command may be triggered by a flick gesture on the touch screen. When the Z order change command is received in operation S2009, the controller may change at least one Z order and display the windows based on the changed Z order in operation S2011.
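A condensed sketch of the flow of Figure 20 follows. It is not the actual controller implementation; WindowRecord, SplitModeController and the command wiring are assumptions used to illustrate operations S2003 to S2011.

// A hedged sketch of the Figure 20 flow: windows are allocated to window display areas, the
// window with the highest Z order in each area is displayed, and a Z order change command
// causes the windows to be displayed based on the changed order.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class WindowRecord {
    final String app;
    int zOrder;
    WindowRecord(String app, int zOrder) { this.app = app; this.zOrder = zOrder; }
}

class SplitModeController {
    // S2003/S2005: several windows may be allocated to one window display area.
    private final Map<Integer, List<WindowRecord>> byArea = new HashMap<>();

    void allocate(WindowRecord w, int area) {
        byArea.computeIfAbsent(area, a -> new ArrayList<>()).add(w);
    }

    // S2007: in each area, display the allocated window with the highest Z order.
    WindowRecord visibleWindow(int area) {
        return byArea.getOrDefault(area, List.of()).stream()
                .max(Comparator.comparingInt(w -> w.zOrder)).orElse(null);
    }

    // S2009/S2011: a Z order change command (e.g. a flick) changes at least one Z order,
    // after which the windows are displayed again based on the changed order.
    void onZOrderChangeCommand(WindowRecord w, int newZOrder) {
        w.zOrder = newZOrder;
    }
}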
Figure 21a, Figure 21b and Figure 21c illustrate screens of a display device according to an embodiment of the present disclosure.
Referring to Figure 21a, the display device 2100 displays a first window 2101 running application A in a first area, a second window 2102 running application B in a second area, a third window 2103 running application C in a third area, and a fourth window 2104 running application D in a fourth area. The display device 2100 further displays a center button 2110.
Figure 22 illustrates an action stack according to an embodiment of the present disclosure.

Referring to Figure 22, the controller (not shown) may manage the action stack on the left side. The controller may manage the Z order of the windows in the order of applications C, F, G, A, D, B, J, K, H, M, L and I. The controller allocates the windows of applications C, F and G to the third area, the windows of applications A, J and L to the first area, the windows of applications D, H and M to the fourth area, and the windows of applications B, K and I to the second area.

The controller detects the applications allocated to the first area and compares the Z orders of the detected applications. The controller may determine that application A has the highest Z order in the first area and therefore controls the first window 2101, in which application A runs, to be displayed in the first area. The controller detects the applications allocated to the second area and compares the Z orders of the detected applications. The controller may determine that application B has the highest Z order in the second area and therefore controls the second window 2102, in which application B runs, to be displayed in the second area. The controller detects the applications allocated to the third area and compares the Z orders of the detected applications. The controller may determine that application C has the highest Z order in the third area and therefore controls the third window 2103, in which application C runs, to be displayed in the third area. The controller detects the applications allocated to the fourth area and compares the Z orders of the detected applications. The controller may determine that application D has the highest Z order in the fourth area and therefore controls the fourth window 2104, in which application D runs, to be displayed in the fourth area.
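The per-area selection just described can be illustrated with a small action-stack sketch. The class below is an assumption-based illustration, not the disclosed implementation: a single stack lists all running applications from highest to lowest Z order, and the application displayed in an area is the entry nearest the top that is allocated to that area.

// Illustrative sketch of the action stack of Fig. 22 (assumptions, not the patent's code).
import java.util.List;

class ActionStack {
    static class Entry {
        final String app;
        final int area;
        Entry(String app, int area) { this.app = app; this.area = area; }
    }

    // Top of stack first, e.g. the left-hand stack of Fig. 22: C, F, G, A, D, B, J, K, H, M, L, I.
    private final List<Entry> entries;

    ActionStack(List<Entry> entries) { this.entries = entries; }

    // The application with the highest Z order among those allocated to the given area.
    String topAppIn(int area) {
        for (Entry e : entries) {
            if (e.area == area) return e.app;
        }
        return null;
    }
}

With the stack ordered C, F, G, A, D, B, J, K, H, M, L, I and the allocation described above, topAppIn returns A for the first area, B for the second area, C for the third area and D for the fourth area, matching Figure 21a.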
Continuing with Figure 21a, the user 1 may input a Z order change command for the third area. For example, the user 1 may make a rightward flick gesture 2120 across the third area. The controller may recognize this rightward flick gesture 2120 as a Z order change command. The rightward flick gesture 2120 may be set as a command to assign the highest Z order to the application having the lowest Z order in the window display area. The Z order change is reflected in the action stack on the right side of Figure 22, where application G is now positioned at the top of the action stack. The rightward flick gesture 2120 is only one example of a command for assigning a Z order within a window display area, and those skilled in the art will readily understand that the Z order change command may be defined by a gesture made in a direction other than rightward. For example, the Z order change command may be defined by various gestures such as a leftward flick gesture, a downward flick gesture, an upward flick gesture, and the like. In addition, many gestures other than flick gestures, including tilting, dragging, shaking and the like, may be defined as the Z order change command, and these examples should not be understood as limiting the disclosure. Herein, the rightward direction may be referred to as a first direction, and the first direction is not limited to the rightward direction.

Referring to Figure 21b and Figure 21c, the controller may control a fifth window 2113 to be displayed in the third area to run application G. When another Z order change command is received through a rightward flick gesture across the third area, the controller may determine that application F now has the lowest Z order in the third area, as shown in Figure 21c. The controller may then control a sixth window 2123 to be displayed in the third area to run application F.
Figure 23a and Figure 23b illustrate screens of a display device describing a Z order change according to an embodiment of the present disclosure. Figure 24 illustrates an action stack according to an embodiment of the present disclosure. Figure 23a may be substantially identical to Figure 21c, and the action stack on the left side of Figure 24 may describe the Z order of the windows displayed in Figure 23a.

Referring to Figure 23a, the user 1 may input a Z order change command for the first area by making a rightward flick gesture 2121 across the first area. The controller (not shown) may determine that application L has the lowest Z order in the first area. The controller may assign the highest Z order to application L, as shown in the action stack of Figure 24.

Referring to Figure 23b, the controller may control a seventh window 2131 to be displayed in the first area to run application L.
Figure 25a and Figure 25b illustrate screens of a display device describing a Z order change according to an embodiment of the present disclosure. Figure 26 illustrates an action stack according to an embodiment of the present disclosure.

Referring to Figure 25a and Figure 25b, the user may input a Z order change command for the second area by making a leftward flick gesture 2130 across the second area. The controller (not shown) may recognize the leftward flick gesture as a Z order change command. The leftward flick gesture may be set as a command to assign the lowest Z order to the application having the highest Z order in the window display area. The leftward direction may be referred to as a second direction.

Accordingly, the controller may assign the lowest Z order to the application having the highest Z order in the second area, as shown in the action stack of Figure 26. Because application B has been assigned the lowest Z order, the controller may control the highest Z order to be assigned to application K in the second area.

The controller may therefore display an eighth window 2142 in the second area to run application K, as shown in Figure 25b. The resulting change of the action stack is illustrated in Figure 26.

As described above, the controller may recognize a rightward flick gesture as a command to assign the highest Z order to the application having the lowest Z order in a window display area. In addition, the controller may recognize a leftward flick gesture as a command to assign the lowest Z order to the application having the highest Z order, so that the application having the next highest Z order in the window display area is displayed. The user can therefore easily switch to the screen of the application having the lowest Z order or the next highest Z order.
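The two flick commands can be illustrated as reordering operations on the action stack. The sketch below reuses the Entry type from the previous sketch; it is an assumption-based illustration, not the disclosed implementation. A rightward flick moves the area's lowest-Z-order entry to the top of the stack, and a leftward flick moves the area's highest-Z-order entry to the bottom.

// Sketch of the flick-based Z order change commands, under the assumptions stated above.
import java.util.LinkedList;
import java.util.List;

class ZOrderCommands {
    private final LinkedList<ActionStack.Entry> stack;   // index 0 = highest Z order

    ZOrderCommands(List<ActionStack.Entry> entries) { this.stack = new LinkedList<>(entries); }

    // Rightward flick: the area's lowest-Z-order application is given the highest Z order.
    void flickRight(int area) {
        for (int i = stack.size() - 1; i >= 0; i--) {
            if (stack.get(i).area == area) {
                stack.addFirst(stack.remove(i));
                return;
            }
        }
    }

    // Leftward flick: the area's highest-Z-order application is given the lowest Z order,
    // so the application with the next highest Z order in that area becomes visible.
    void flickLeft(int area) {
        for (int i = 0; i < stack.size(); i++) {
            if (stack.get(i).area == area) {
                stack.addLast(stack.remove(i));
                return;
            }
        }
    }
}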
Figure 27a and Figure 27b illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure. Figure 28 illustrates an action stack according to an embodiment of the present disclosure.

Referring to Figure 27a and Figure 27b, the display device 2100 displays a first window 2101 running application A in the first area, a second window 2102 running application B in the second area, a third window 2103 running application C in the third area, and a fourth window 2104 running application D in the fourth area. The display device 2100 may manage the action stack shown in Figure 28. The user 1 may input a Z order change command for the third area. For example, while touching the third area as indicated by reference numeral 2701, the user 1 may make a rightward tilt gesture 2700. The controller may recognize the combination of the touch and the rightward tilt as a Z order change command.

The controller may change the Z order in the action stack shown in Figure 28 based on the Z order change command, as previously described with reference to Figure 22, and thus this will not be described in further detail. The controller may control a fifth window 2113 to be displayed in the third area to run application G, as shown in Figure 27b.
Figure 29a and Figure 29b illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure.

Referring to Figure 29a and Figure 29b, the display device 2100 displays a first window 2101 running application A in the first area, a second window 2102 running application B in the second area, a third window 2103 running application C in the third area, and a fourth window 2104 running application D in the fourth area. The display device 2100 may manage the action stack shown in Figure 30. The user 1 may input a Z order change command for the second area. For example, while touching the second area as indicated by reference numeral 2901, the user 1 may make a leftward tilt gesture 2900. The controller may recognize the combination of the touch and the leftward tilt as a Z order change command.

The controller may change the Z order in the action stack shown in Figure 30 based on the Z order change command, as previously described with reference to Figure 26, and thus this will not be described in further detail. The controller may control an eighth window 2142 to be displayed in the second area to run application K, as shown in Figure 29b.
Figure 31a and Figure 31b illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure. Figure 32 illustrates an action stack according to an embodiment of the present disclosure.

Referring to Figure 31a, Figure 31b and Figure 32, the display device 2100 displays a first window 2101 running application A in the first area, a second window 2102 running application B in the second area, a third window 2103 running application C in the third area, and a fourth window 2104 running application D in the fourth area. The display device 2100 may manage the action stack shown in Figure 32. The user 1 may input a Z order change command for the third area. For example, the user 1 may touch a point 3100 in the third area and make a flick gesture 3101 toward the right edge within the third area. The controller may recognize the flick toward the right edge as a Z order change command.

The controller may change the Z order in the action stack shown in Figure 32 based on the Z order change command, as previously described with reference to Figure 22, and thus this will not be described in further detail. The controller may control a fifth window 2113 to be displayed in the third area to run application G, as shown in Figure 31b.
Figure 33a and Figure 33b illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure. Figure 34 illustrates an action stack according to an embodiment of the present disclosure.

Referring to Figure 33a, Figure 33b and Figure 34, the display device 2100 displays a first window 2101 running application A in the first area, a second window 2102 running application B in the second area, a third window 2103 running application C in the third area, and a fourth window 2104 running application D in the fourth area. The display device 2100 may manage the action stack shown in Figure 34.

The user 1 may input a Z order change command for the second area. For example, the user 1 may touch a point 3300 in the second area and make a flick gesture 3301 toward the left edge within the second area. The controller may recognize the flick toward the left edge as a Z order change command.

The controller may change the Z order in the action stack shown in Figure 34 based on the Z order change command, as previously described with reference to Figure 26, and thus this will not be described in further detail. The controller may control an eighth window 2142 to be displayed in the second area to run application K, as shown in Figure 33b.
So far, methods of changing the Z order in split mode have been described. A description will now be given of methods of changing the Z order in freestyle mode.
Figure 35 is a flowchart of a method of controlling a display device according to an embodiment of the present invention.

Referring to Figure 35, in operation S3501, the controller may set a layout in freestyle mode. The controller may receive application execution commands and, accordingly, may generate a plurality of windows for running a plurality of applications in operation S3503. In operation S3505, the controller may determine the Z order of each of the plurality of windows, and in operation S3507, the controller may display the windows based on their Z orders.

In operation S3509, the controller may determine whether a Z order change command has been received. When the Z order change command is received, the controller may control the windows to be displayed overlapping one another according to the changed Z order in operation S3511.
Figure 36a, Figure 36b and Figure 36c illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure, and Figure 37a, Figure 37b and Figure 37c illustrate action stacks according to an embodiment of the present disclosure.

Referring to Figure 36a, Figure 36b, Figure 36c, Figure 37a, Figure 37b and Figure 37c, the controller may manage the action stack shown in Figure 37a. For example, the controller may assign the lowest Z order to window A running application A, the middle Z order to window B running application B, and the highest Z order to window C running application C. Accordingly, the controller may display the windows in the order of window C 3631 and 3632, window B 3621 and 3622, and window A 3611 and 3612, as shown in Figure 36a.

The user 1 may input a Z order change command. When the title bar 3631 of window C is touched as indicated by reference numeral 3641, the Z order change command may be triggered by a pinch-out gesture 3642. A pinch-out may be a gesture of widening the distance between two touch points. In this case, the Z order of window C 3631 and 3632 may be reset to the lowest Z order. As a result, the controller assigns the middle Z order to window A, the highest Z order to window B, and the lowest Z order to window C. For example, the controller may assign the lowest Z order to window C while increasing the Z order of each of the other windows by 1. The controller may therefore control the windows to be displayed in the order of window B 3621 and 3622, window A 3611 and 3612, and window C 3631 and 3632. The pinch-out is only one example of a Z order change command, and those skilled in the art will readily understand that the Z order change command may be triggered by various gestures including a flick, a drag, an edge flick, a touch-and-tilt, a tilt, and a shake.

The user 1 may input another Z order change command. When the application execution screen 3622 of window B is touched as indicated by reference numeral 3651, the Z order change command may be triggered by a pinch-out gesture 3652. In this case, the Z order of window B 3621 and 3622 may be reset to the lowest Z order. As a result, the controller assigns the highest Z order to window A, the lowest Z order to window B running application B, and the middle Z order to window C running application C. For example, the controller may assign the lowest Z order to window B while increasing the Z order of each of the other windows by 1. The controller may therefore control the windows to be displayed in the order of window A 3611 and 3612, window C 3631 and 3632, and window B 3621 and 3622.
Figure 38a, Figure 38b and Figure 38c illustrate screens of a display device describing a Z order change command according to an embodiment of the present disclosure, and Figure 39a, Figure 39b and Figure 39c illustrate action stacks according to an embodiment of the present disclosure.

Referring to Figure 38a, Figure 38b, Figure 38c, Figure 39a, Figure 39b and Figure 39c, the controller may manage the action stack shown in Figure 39a. For example, the controller may assign the lowest Z order to window A running application A, the middle Z order to window B running application B, and the highest Z order to window C running application C. Accordingly, the controller may display the windows in the order of window C 3631 and 3632, window B 3621 and 3622, and window A 3611 and 3612, as shown in Figure 38a.

The user 1 may input a Z order change command. When the title bar 3631 of window C is touched as indicated by reference numeral 3841, the Z order change command may be triggered by a pinch-in gesture 3842. A pinch-in may be a gesture of narrowing the distance between two touch points. In this case, the Z order of window A 3611 and 3612, which has the lowest Z order, may be reset to the highest Z order while the Z order of each of the other windows is reduced by 1. The controller may therefore assign the highest Z order to window A, the lowest Z order to window B running application B, and the middle Z order to window C running application C, as shown in Figure 39b.

As shown in Figure 38b, the controller may control the windows to be displayed in the order of window A 3611 and 3612, window C 3631 and 3632, and window B 3621 and 3622. The user 1 may input another Z order change command. When the title bar 3631 of window C is touched as indicated by reference numeral 3851, the Z order change command may be triggered by a pinch-in gesture 3852. In this case, the Z order of window B 3621 and 3622, which has the lowest Z order, may be reset to the highest Z order while the Z order of each of the other windows is reduced by 1. As a result, the controller may assign the middle Z order to window A running application A, the highest Z order to window B running application B, and the lowest Z order to window C running application C, as shown in Figure 39c.

The controller may therefore control the windows to be displayed in the order of window B 3621 and 3622, window A 3611 and 3612, and window C 3631 and 3632.

As described above, when a Z order change command is received through a pinch-out gesture, the controller may assign the lowest Z order to the window having the highest Z order. Conversely, when a Z order change command is received through a pinch-in gesture, the controller may assign the highest Z order to the window having the lowest Z order. Because the user can change the Z order of windows by a simple manipulation, a desired window can easily be brought to the top layer of the screen, which improves user convenience.
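The freestyle-mode commands summarized above amount to rotating a window stack in one of two directions. The following sketch is an illustration under assumptions (a deque ordered from highest to lowest Z order), not the disclosed implementation.

// Illustrative sketch: pinch-out sends the top window to the bottom while every other window
// rises by one level; pinch-in brings the bottom window to the top while every other window
// drops by one level.
import java.util.ArrayDeque;
import java.util.Deque;

class FreestyleZOrder {
    // Head of the deque = highest Z order (top of the screen), tail = lowest Z order.
    private final Deque<String> windows = new ArrayDeque<>();

    void open(String window) { windows.addFirst(window); }

    // Pinch-out: the window with the highest Z order is assigned the lowest Z order.
    void pinchOut() {
        if (!windows.isEmpty()) windows.addLast(windows.removeFirst());
    }

    // Pinch-in: the window with the lowest Z order is assigned the highest Z order.
    void pinchIn() {
        if (!windows.isEmpty()) windows.addFirst(windows.removeLast());
    }

    String topWindow() { return windows.peekFirst(); }
}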
Figure 40a, Figure 40b, Figure 40c, Figure 40d, Figure 40e, Figure 40f, Figure 40g, Figure 40h, Figure 40i, Figure 40j and Figure 40k illustrate a method of displaying application execution windows according to an embodiment of the present disclosure.

Referring to Figure 40a to Figure 40k, the display device 4200 may define a plurality of areas 4201, 4202, 4203 and 4204 on the touch screen. For convenience of description of Figure 40a and the subsequent drawings, the plurality of areas 4201, 4202, 4203 and 4204 are referred to as a first area 4201, a second area 4202, a third area 4203 and a fourth area 4204 (area 1, area 2, area 3 and area 4), respectively. The first area 4201 and the third area 4203 together form a fifth area (area 5, not shown), and the second area 4202 and the fourth area 4204 together form a sixth area (area 6, not shown). The first area 4201 and the second area 4202 together form a seventh area (area 7, not shown), and the third area 4203 and the fourth area 4204 together form an eighth area (area 8, not shown). The first to fourth areas 4201, 4202, 4203 and 4204 together form an area F. A first boundary 4211 may be set between the first area 4201 and the second area 4202, a second boundary 4212 between the third area 4203 and the fourth area 4204, a third boundary 4213 between the first area 4201 and the third area 4203, and a fourth boundary 4214 between the second area 4202 and the fourth area 4204. The first boundary 4211 and the second boundary 4212 may form a single line, and the third boundary 4213 and the fourth boundary 4214 may form a single line. The first to fourth boundaries 4211 to 4214 need not be displayed explicitly; alternatively, the first to fourth boundaries 4211 to 4214 may be displayed as dotted lines. The controller (not shown) may configure the first to fourth areas 4201, 4202, 4203 and 4204 so that they do not overlap one another. For example, as shown in Figure 40a, the controller may configure the first area 4201 at the upper left corner, the second area 4202 at the upper right corner, the third area 4203 at the lower left corner, and the fourth area 4204 at the lower right corner. The controller may divide the screen into left and right parts by the first boundary 4211 and the second boundary 4212, and into upper and lower parts by the third boundary 4213 and the fourth boundary 4214.

The touch screen may display a center button 4220 at the crossing point where the first to fourth boundaries 4211 to 4214 meet. The center button 4220 may be a function key for changing the size of an area in which an application execution window is displayed or for setting an operation mode for controlling execution windows.

The controller may control the touch screen so that an application execution window is displayed in each of the plurality of areas. For example, the controller may control the touch screen so that a window in which an application runs, that is, an application execution window, is displayed in each of the areas 4201, 4202, 4203 and 4204, as shown in Figure 40b, Figure 40c, Figure 40d, Figure 40e, Figure 40f, Figure 40g, Figure 40h, Figure 40i, Figure 40j and Figure 40k.
Objects related to an application may be displayed in the execution screen of the application. An object may take various forms, such as text, a figure, an icon, a button, a check box, a photo, a video, a web page, a map, and so on. When the user touches an object, the function or event corresponding to the touched object may be performed in the application. Depending on the operating system, an object may be referred to as a view. For example, at least one of a capture button for capturing the execution window, a minimize button for minimizing the execution window, a maximize button for maximizing the size of the execution window, and an exit button for ending the execution window may be displayed to control the display of the execution window.
Referring to Figure 40b, the controller can control the touch screen to display icons 4231, 4232, 4233, 4234, 4235, 4236 and 4237 representing applications that can be executed. The display device 4200 can execute application A. As shown in Figure 40b, in response to the execution of application A, the controller can control the touch screen to display the execution window 4230 of application A in the first region 4201. In addition, the controller can control the touch screen to display the icons 4231, 4232, 4233, 4234, 4235, 4236 and 4237 representing the executable applications at a specific position of the touch screen. When a touch input on one of the icons 4231 to 4237 is received, that is, when an input selecting the icon representing an application to be executed is received, the display device 4200 can display the execution window of the application corresponding to the selected icon in one of the first to fourth regions 4201, 4202, 4203 and 4204.
The controller can display an indicator 4221 indicating the active region on the displayed center button 4220. The active region can be the region in which the most recently executed application, or the application most recently selected by the user, is displayed. An application manipulated by the user can be regarded as the application selected by the user.
The indicator 4221 can be implemented in various ways to indicate the position of the active region. For example, at least a part of the execution window displayed in the active region may be displayed overlapping the center button 4220. Alternatively, an arrow pointing in the direction of the active region may be displayed on the center button 4220.
The active region can be determined based on an activity stack. The most recently executed application, or the application most recently selected by the user, can be placed on top of the activity stack. The display device 4200 can determine the region displaying the execution window of the application on top of the activity stack as the active region. The active region may also be called the focus region. For example, in Figure 40b, the indicator 4221 can indicate the first region 4201.
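A minimal sketch of this relationship between the activity stack and the active (focus) region follows; the names ActivityStack, Activity and Region, and the in-memory deque representation, are assumptions introduced only for illustration:

// Minimal sketch, assuming a simple in-memory activity stack; the class and
// field names are illustrative and not taken from the disclosure.
data class Activity(val app: String, var region: Int)

class ActivityStack {
    private val stack = ArrayDeque<Activity>()   // last element = top of stack

    // Executing an application, or selecting it by user input, moves its activity to the top.
    fun moveToTop(app: String, region: Int) {
        stack.removeAll { it.app == app }
        stack.addLast(Activity(app, region))
    }

    // The active (focus) region is the region of the activity on top of the stack.
    fun activeRegion(): Int? = stack.lastOrNull()?.region
}

fun main() {
    val s = ActivityStack()
    s.moveToTop("A", region = 1)
    s.moveToTop("B", region = 2)
    println(s.activeRegion())   // 2: the indicator on the center button would point to region 2
}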
Figure 41a, Figure 41b, Figure 41c, Figure 41d, Figure 41e and Figure 41f illustrate activity stacks according to various embodiments of the disclosure.
Figure 41a illustrates the activity stack managed by the display device 4200. In response to the execution of application A, the controller can generate and manage an activity 4301 for application A in the activity stack.
Referring to Figure 40c, user 1 can touch the icon 4232 representing application B. When the icon 4232 representing application B is touched, the controller controls the touch screen to display the execution window 4240 of application B in the second region 4202, as shown in Figure 40d. The controller can determine the region in which a new execution window is displayed according to a certain order. For example, the controller can control new execution windows to be displayed in the order of the second region 4202, the third region 4203 and the fourth region 4204. This display order of execution windows is only an example, and the order in which new execution windows are displayed in the regions 4201, 4202, 4203 and 4204 can be changed according to various embodiments of the disclosure.
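As a rough illustration of such a placement order (not taken from the disclosure; the specific order and the occupied-set representation are assumptions, and the disclosure notes the order is configurable), a controller might pick the next free region as follows:

// Hypothetical sketch of a fixed placement order; region numbers and the
// occupied-set representation are assumptions used only for illustration.
class RegionAssigner(private val order: List<Int> = listOf(2, 3, 4)) {
    private val occupied = mutableSetOf(1)   // region 1 already shows application A

    // Returns the next free region in the configured order, or null if none is free.
    fun nextRegionForNewWindow(): Int? {
        val region = order.firstOrNull { it !in occupied } ?: return null
        occupied.add(region)
        return region
    }
}

fun main() {
    val assigner = RegionAssigner()
    println(assigner.nextRegionForNewWindow())   // 2: a newly executed application goes to region 2
    println(assigner.nextRegionForNewWindow())   // 3
    println(assigner.nextRegionForNewWindow())   // 4
}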
Because the execution window 4240 of application B is displayed in the second region 4202, the indicator 4221 can indicate the second region 4202 in Figure 40d.
Figure 41b illustrates the activity stack corresponding to Figure 40d. In response to the execution of application B, the controller generates an activity 4302 for application B in the activity stack. The controller can place the activity 4302 of the most recently executed application B on the activity 4301 of application A.
Referring to Figure 40e, user 1 can touch the icon 4233 corresponding to application C. When the icon 4233 representing application C is touched, the controller controls the touch screen to display the execution window 4250 of application C in the fourth region 4204, as shown in Figure 40f. Together with the display of the execution window 4250 of application C in the fourth region 4204, the indicator 4221 can indicate the fourth region 4204.
Figure 41c illustrates the activity stack corresponding to Figure 40f. In response to the execution of application C, the controller generates an activity 4303 for application C in the activity stack. The controller places the activity 4303 of the most recently executed application C on top of the activity stack.
Referring to Figure 40g, user 1 can touch the icon 4234 representing application D. When the icon 4234 representing application D is touched, the controller controls the touch screen to display the execution window 4260 of application D in the third region 4203, as shown in Figure 40h. Because the execution window 4260 of application D is displayed in the third region 4203, the indicator 4221 on the center button 4220 can indicate the third region 4203.
Figure 41d illustrates the activity stack corresponding to Figure 40h. In response to the execution of application D, the controller generates an activity 4304 for application D in the activity stack. The controller places the activity 4304 of the most recently executed application D on top of the activity stack.
Referring to Figure 40i, user 1 can manipulate application B. Figure 41e illustrates the activity stack corresponding to Figure 40i. In response to the user input on the execution window 4240 of application B, the controller moves the activity 4302 of application B to the top of the activity stack.
When the user input on the execution window 4240 of application B is received, the controller can determine the second region 4202 as the active region, as shown in Figure 40i. Accordingly, the indicator 4221 on the center button 4220 can indicate the second region 4202.
Referring to Figure 40j, user 1 can touch the icon 4235 representing application E. When the icon 4235 representing application E is touched, the controller controls the touch screen to display the execution window 4270 of application E in the fourth region 4204, as shown in Figure 40k. When there is no empty region, the controller can refer to the activity stack shown in Figure 41e. The controller can select the lowest activity from the activity stack and can display the execution window 4270 of application E in the fourth region 4204 in place of the execution window of application C.
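A minimal sketch of one possible fallback of this kind, assuming the region of the lowest (least recently used) activity in the stack is reused, is given below. The Entry and WindowPlacer names and the stack contents are assumptions; they do not reproduce Figure 41e exactly:

// Minimal sketch, assuming every displayed window has an entry in the activity
// stack; class and field names are illustrative, not taken from the disclosure.
data class Entry(val app: String, val region: Int)

class WindowPlacer(private val stack: MutableList<Entry>) {   // index 0 = bottom, last = top

    // When no region is empty, reuse the region of the lowest activity in the
    // stack and place the new application's window there.
    fun placeNewApp(app: String): Int {
        val lowest = stack.first()
        val region = lowest.region
        stack.removeAt(0)
        stack.add(Entry(app, region))   // the new application becomes the top activity
        return region
    }
}

fun main() {
    val stack = mutableListOf(Entry("C", 4), Entry("D", 3), Entry("B", 2))   // bottom .. top
    val placer = WindowPlacer(stack)
    println(placer.placeNewApp("E"))   // 4: E reuses the region of the lowest activity
}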
Figure 41f illustrates the activity stack corresponding to Figure 40k. In response to the execution of application E, the controller generates an activity 4305 for application E in the activity stack. The controller places the activity 4305 of the most recently executed application E on top of the activity stack.
Figure 42 is a flowchart illustrating a method of executing applications in a display device according to an embodiment of the present invention.
Referring to Figure 42, in operation S4410, the display device can execute a plurality of applications. For example, the display device can execute an application in response to receiving a user input on an icon representing the application.
In operation S4420, the display device can determine a layout for arranging the application execution windows. The layout defines the regions in which the execution windows can be arranged. For example, various layouts are available, including a 2-region top/bottom split layout, a 2-region left/right split layout, a 3-region split layout, a 4-region split layout, and so on.
In operation S4430, the display device can determine window positions within the layout. For example, when a 2-region left/right split layout defining the fifth region and the sixth region is used, the display device can allocate the execution windows of a web browser and an e-book application to the fifth region and the execution window of a video playback application to the sixth region.
In operation S4440, the display device can display the plurality of execution windows according to the priorities of the applications. For example, if the execution windows of the web browser and the e-book application are allocated to the fifth region, the execution window of whichever of the web browser and the e-book application has the higher priority may be displayed in the fifth region.
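The layout selection and priority-based display of operations S4420 to S4440 could be sketched as follows; the Layout enum, the numeric priorities and the region-to-application map are assumptions chosen only for illustration and are not the disclosed implementation:

// Hedged sketch: layout names, priority values and the assignment map are assumptions.
enum class Layout { SPLIT_TOP_BOTTOM, SPLIT_LEFT_RIGHT, SPLIT_3, SPLIT_4 }

data class App(val name: String, val priority: Int)

// Each region may be assigned several execution windows; only the window of the
// highest-priority application assigned to a region is actually displayed there.
fun visibleWindows(assignment: Map<Int, List<App>>): Map<Int, App> =
    assignment.mapValues { (_, apps) -> apps.maxByOrNull { it.priority }!! }

fun main() {
    val layout = Layout.SPLIT_LEFT_RIGHT            // fifth region | sixth region
    val assignment = mapOf(
        5 to listOf(App("web browser", 2), App("e-book", 1)),
        6 to listOf(App("video player", 3))
    )
    println(layout)
    println(visibleWindows(assignment))   // region 5 shows the web browser, region 6 the video player
}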
Figure 43a and Figure 43b illustrate a method of controlling the display regions of application execution windows using the center button, according to an embodiment of the disclosure.
Referring to Figure 43a, the display device 4500 can define a first region 4501, a second region 4502, a third region 4503, a fourth region 4504, a first boundary 4505, a second boundary 4507, a third boundary 4506 and a fourth boundary 4508, which should not be interpreted as limiting the disclosure. The display device 4500 can define the regions and boundaries in various ways.
The display device 4500 can display the center button 4220 on at least one boundary. For example, if the first boundary 4505, the second boundary 4507, the third boundary 4506 and the fourth boundary 4508 are defined, the display device 4500 can display the center button 4220 at the intersection where the first boundary 4505, the second boundary 4507, the third boundary 4506 and the fourth boundary 4508 meet, as shown in Figure 43a. In another example, if the display device 4500 defines the fifth region and the sixth region (not shown) together with the first boundary 4505 and the second boundary 4507, the display device 4500 can display the center button 4220 on the first boundary 4505 or the second boundary 4507.
Referring to Figure 43b, if user 10 touches the center button 4220 and drags the touched center button 4220, the display device 4500 can move the center button 4220 to the dragged position. As the center button 4220 moves, the display device 4500 can change the sizes and positions of the boundaries and of the regions in which the application execution windows are displayed.
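A minimal sketch of how the region geometry could follow the center button, assuming a simple four-quadrant split driven by the button's coordinates (the Rect type and the screen dimensions are assumptions, not part of the disclosure):

// Illustrative sketch only: the disclosure does not specify the geometry code
// used by the controller; Rect and the quadrant split are assumptions.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

// Recompute the four regions from the current center-button position: the
// button's x coordinate gives the vertical boundary, its y coordinate the horizontal one.
fun regionsFromButton(x: Int, y: Int, width: Int, height: Int): Map<Int, Rect> = mapOf(
    1 to Rect(0, 0, x, y),            // first region  (upper left)
    2 to Rect(x, 0, width, y),        // second region (upper right)
    3 to Rect(0, y, x, height),       // third region  (lower left)
    4 to Rect(x, y, width, height)    // fourth region (lower right)
)

fun main() {
    // Dragging the center button from the screen center toward the upper left
    // shrinks the first region and enlarges the fourth region.
    println(regionsFromButton(x = 540, y = 960, width = 1080, height = 1920)[1])
    println(regionsFromButton(x = 300, y = 500, width = 1080, height = 1920)[4])
}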
Figure 44a, Figure 44b, Figure 44c, Figure 44d, Figure 44e, Figure 44f, Figure 44g, Figure 44h, Figure 44i, Figure 44j, Figure 44k, Figure 44l, Figure 44m, Figure 44n, Figure 44o, Figure 44p, Figure 44q, Figure 44r, Figure 44s, Figure 44t, Figure 44u, Figure 44v and Figure 44w illustrate a method of executing a plurality of applications according to an embodiment of the disclosure.
Referring to Figures 44a to 44w, while application A is running, the display device 4600 can display a list 4610 of at least one application. The application list 4610 lists applications that can be executed. For example, icons 4611, 4612, 4613, 4614, 4615, 4616 and 4617 representing the executable applications can be listed in the application list 4610.
Figure 45a, Figure 45b, Figure 45c, Figure 45d, Figure 45e, Figure 45f, Figure 45g, Figure 45h, Figure 45i and Figure 45j illustrate activity stacks according to an embodiment of the disclosure.
Figure 45a illustrates the activity stack corresponding to Figure 44a. Because the execution window of application A is displayed in region F, which occupies the entire screen, the controller (not shown) generates an activity for application A, as shown in Figure 45a.
User 10 can manipulate the display device 4600 to additionally execute application B. For example, as shown in Figure 44b, user 10 can touch the icon 4612 representing the second application B and, as shown in Figure 44c, drag the touched icon 4612 to the sixth region 4623.
If the drag input ends in the sixth region 4623, the display device 4600 executes application B corresponding to the selected icon 4612. In addition, when application B is displayed in the sixth region 4623, the display device 4600 moves application A, which was displayed in region F, to the fifth region 4619.
As a result, the execution window 4620 of application A is displayed in the fifth region 4619, and the execution window 4630 of application B is displayed in the sixth region 4623.
The display device 4600 can display a center button 4622 on the boundary between the fifth region 4619 and the sixth region 4623. The display device 4600 can also display, on the center button 4622, an indicator 4621 indicating the execution window 4630 of the most recently executed application B. The indicator 4621 can indicate the region displaying the execution window of the application placed on top of the activity stack.
Figure 45b illustrates the activity stack corresponding to Figure 44d. The activity of application A, previously displayed in region F, is changed so that application A is displayed in the fifth region 4619. Because application B is executed, an activity for application B is generated, and application B is arranged in the sixth region 4623. The activity of application B is placed on top of the activity stack.
Referring to Figure 44e, user 10 can move the displayed center button 4622. As shown in Figures 43a and 43b, as the center button 4622 moves, the sizes of the regions in which the application execution windows are displayed can be changed.
Subsequently, as shown in Figure 44f, user 10 can touch the icon 4613 representing application C and, as shown in Figure 44g, drag the touched icon 4613 to the fourth region 4627. The size of the fourth region 4627 can be determined according to the position of the center button 4622. When the drag of the icon 4613 representing application C ends in the fourth region 4627, the display device 4600 can execute application C. As shown in Figure 44h, the display device 4600 can display the execution window 4640 of application C in the fourth region 4627. The display device 4600 can display, on the center button 4622, the indicator 4621 indicating the active region in which the execution window 4640 of application C is displayed.
Figure 45c illustrates the activity stack corresponding to Figure 44h. The controller (not shown) generates an activity for application C in response to the execution of application C. Application C is arranged in the fourth region 4627. Because the region displaying the execution window 4630 of application B is split, application B is allocated to the second region in Figure 44e.
Referring to Figure 44i, user 10 can control the sizes of the regions displaying the execution windows 4620 to 4640 of the applications by applying an input that moves the center button 4622.
As shown in Figure 44j, user 10 can touch the icon 4614 representing application D and, as shown in Figure 44k, drag the touched icon 4614 to the third region 4631.
If the drag of the touched icon 4614 representing application D ends in the third region 4631, the display device 4600 can execute application D. As shown in Figure 44l, the display device 4600 can display the execution window 4650 of application D in the third region 4631. The display device 4600 can display, on the button 4622, the indicator 4621 indicating the active region in which the execution window 4650 of application D is displayed.
Figure 45d illustrates the activity stack corresponding to Figure 44l. The controller generates an activity for application D in response to the execution of application D. Application D is allocated to the third region 4631. Because the region displaying the execution window 4620 of application A is split, application A is allocated to the first region in Figure 44i.
User 10 can touch the icon 4615 representing application E and drag the touched icon 4615 to a boundary region 4659, as shown in Figure 44m. The display device 4600 defines the boundary region 4659 so as to include the boundary 4685.
If the drag of the touched icon 4615 representing application E ends in the boundary region 4659, the display device 4600 can execute application E. The display device 4600 can arrange the execution window 4660 of application E in the seventh region, which includes the first region and the second region adjacent to the boundary 4685 included in the boundary region 4659, as shown in Figure 44n. The display device 4600 can display, on the button 4622, the indicator 4621 indicating the active region in which the execution window 4660 of application E is displayed.
Figure 45e illustrates the activity stack corresponding to Figure 44n. The controller generates an activity for application E in response to the execution of application E. Application E is allocated to the seventh region, and applications A and B, which were displayed in the first region and the second region included in the seventh region, are placed in the seventh region.
User 10 can touch the icon 4616 representing application F and drag the touched icon 4616 to the second region 4661, as shown in Figure 44o.
If the drag of the touched icon 4616 representing application F ends in the second region 4661, the display device 4600 can execute application F. As shown in Figure 44p, the display device 4600 can display the execution window 4670 of application F in the second region 4661.
Referring to Figure 45f, the controller can generate an activity for arranging application F in the second region. As the seventh region is split, applications A, B and E, which were displayed in the seventh region, can be arranged in the first region.
The touch screen can receive an input selecting the execution window 4660 of application E from user 10, as shown in Figure 44p.
Referring to Figure 45g, in response to the selection of the execution window 4660 of application E, the controller can move the activity of application E to the top of the activity stack. The display device 4600 can display, on the button 4622, the indicator 4621 indicating the position of the execution window 4660.
Referring to Figure 44q, the display device 4600 can receive an input selecting the execution window 4660 of application E from user 10. For example, user 10 can touch the button 4622. In response to receiving the input selecting the button 4622, the display device 4600 can display a list of the applications 4611, 4612 and 4615 displayed in the first region, which is currently the active region. For example, referring to the activity stack shown in Figure 45g, the display device 4600 can display, in the first region, the icons representing application A, application B and application E allocated to the first region.
In response to receiving the input selecting the button 4622, the display device 4600 can further display icons 4691, 4692 and 4693 representing operations related to the execution windows of the applications displayed in the first region.
When an input on the icon 4611 representing application A, from among the icons displayed in the first region, is received, the display device 4600 can display the execution window 4620 of application A in the first region, as shown in Figure 44s.
Figure 45h illustrates the activity stack corresponding to Figure 44s. In response to receiving the input selecting the icon 4611 representing application A, the controller can move the activity of application A to the top of the activity stack.
Referring to Figure 44t, when an input selecting the center button 4622 is received, the list of the applications 4611, 4612 and 4615 allocated to the first region, which is the active region, can be displayed. In addition, a drag input dragging the icon 4612 representing application B to the region displaying the execution window 4640 of application C can be received from user 10. When the drag input is completed, the display device 4600 can display the execution window 4630 of application B in the fourth region, as shown in Figure 44u. The display device 4600 can display, on the button 4622, the indicator 4621 indicating the position of the execution window 4630 of application B.
Figure 45i illustrates the activity stack corresponding to Figure 44u. Because the execution window 4630 of application B is displayed in the fourth region, the controller updates the region to which application B is allocated to the fourth region and moves the activity of application B to the top of the activity stack.
Referring to Figure 44v, when an input selecting the center button 4622 is received, icons 4691, 4692 and 4693 representing operations related to the execution window of the application displayed in the first region, which is the active region, can further be displayed. The operations related to the execution window of an application can perform additional functions with respect to that execution window. For example, the icons representing the operations related to the execution window can include at least one of an exit button 4691 for ending the execution window, a maximize button 4692 for displaying the execution window in full screen, and a capture button 4693 for capturing the execution window, which should not be interpreted as limiting the disclosure. When an input selecting the exit button 4691 is received from user 10, the controller can end the execution window of application A, as shown in (b) of Figure 44v.
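As an illustration only (the WindowOp enum and the controller class are assumptions, not the disclosed implementation), the three window operations named above could be dispatched as follows:

// Minimal sketch: the disclosure only names the exit, maximize and capture
// operations; everything else here is an assumption for illustration.
enum class WindowOp { EXIT, MAXIMIZE, CAPTURE }

class ActiveWindowController(var activeApp: String?) {
    fun handle(op: WindowOp) {
        when (op) {
            WindowOp.EXIT -> {                       // end the active execution window
                println("closing window of $activeApp")
                activeApp = null
            }
            WindowOp.MAXIMIZE ->                     // show the active window in full screen
                println("maximizing window of $activeApp")
            WindowOp.CAPTURE ->                      // capture the contents of the active window
                println("capturing window of $activeApp")
        }
    }
}

fun main() {
    val controller = ActiveWindowController("A")
    controller.handle(WindowOp.EXIT)   // corresponds to tapping the exit button 4691
}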
Figure 45j illustrates the activity stack corresponding to Figure 44v. When the execution window of application A is ended, the activity of application A can be removed from the activity stack.
Referring to Figure 44w, when an input selecting the maximize button 4692 is received from user 10, the display device 4600 can display the execution window 4660 of the application displayed in the active region in full screen on the touch screen.
When an input selecting the capture button 4693 is received from user 10, the display device 4600 can capture the execution window 4660 of the active region.
Figure 46 illustrates a method of providing a user interface for executing applications in a display device according to an embodiment of the present invention.
Referring to Figure 46, in operation S4810, the display device can define a plurality of regions in which application execution windows are displayed on the touch screen. In operation S4820, the display device can also display a button on at least one boundary between the plurality of regions.
The display device can display an indicator on the button to indicate the active region. The active region can refer to the most recently selected region among the plurality of regions. In addition, the active region means a region in which the execution window is in a state controllable by user input.
In operation S4830, the display device can receive an input selecting the button. When the input selecting the button is received, in operation S4840, the display device can display a list of applications in a specific region. Here, the specific region can be the active region.
The application list can list at least one icon representing at least one application. When an input selecting at least one of the applications included in the application list is received, the display device can display the execution window of the selected application in the specific region. When a drag input dragging an icon included in the application list is received, the display device can display the execution window of the application corresponding to the dragged icon in the region to which the icon was dragged.
The display device can further display icons of operations related to the execution window of the application displayed in the specific region. The icons representing the operations related to the execution window of the application can include at least one of a capture button for capturing the execution window to control its display, a minimize button for minimizing the execution window, a maximize button for maximizing the size of the execution window, and an exit button for ending the execution window.
Figure 47 is a flowchart illustrating a method of executing applications in a display device according to an embodiment of the present invention.
Referring to Figure 47, in operation S4910, the display device can define a plurality of regions in which application execution windows are displayed on the touch screen. In operation S4920, the display device can also display a button on at least one boundary between the plurality of regions.
In operation S4930, the display device can display a list of icons of at least one application in a partial region of the touch screen.
In operation S4940, the display device can determine the region in which a new application is to be executed, based on the position to which the application execution icon is dragged and the position of the button. The execution region of the new application is the region in which the execution window of the additionally executed application will be displayed.
If the dragged position falls within a boundary region including at least one boundary, the execution region of the new application can be determined to be a region that includes the regions adjacent to the at least one boundary.
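One way to make this decision, sketched under the assumption of a four-quadrant layout with a configurable borderline margin, is shown below; the numeric region codes, the margin value and the Point type are illustrative assumptions, not the disclosed algorithm:

// Hedged sketch of operation S4940. Regions 1 to 4 are the quadrants; region 7
// spans the two upper quadrants and region 8 the two lower ones when the drop
// lands in the borderline strip around the vertical boundary (the button
// position gives the boundaries).
data class Point(val x: Int, val y: Int)

fun regionForDrop(drop: Point, buttonX: Int, buttonY: Int, margin: Int = 40): Int {
    val nearVerticalBoundary = kotlin.math.abs(drop.x - buttonX) <= margin
    return when {
        nearVerticalBoundary && drop.y < buttonY -> 7   // spans the regions adjacent to the boundary
        nearVerticalBoundary -> 8
        drop.x < buttonX && drop.y < buttonY -> 1
        drop.x >= buttonX && drop.y < buttonY -> 2
        drop.x < buttonX -> 3
        else -> 4
    }
}

fun main() {
    println(regionForDrop(Point(545, 300), buttonX = 540, buttonY = 960))    // 7: window spans regions 1 and 2
    println(regionForDrop(Point(900, 1500), buttonX = 540, buttonY = 960))   // 4
}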
Subsequently, in operation S4950, the display device can display the execution window of the application in the determined region.
Figure 48 is a block diagram of a display device according to an embodiment of the disclosure.
Referring to Figure 48, the display device 5000 can include: a touch screen 5010 configured to display application execution windows in a plurality of regions, to display a button on at least one boundary between the plurality of regions, and to receive an input selecting the button; and a controller 5020 configured to control the touch screen 5010, based on the received input, to display, in a specific region selected from among the plurality of regions, a list of at least one application executed in the specific region.
The specific region can include an active region controllable by user input. The active region can be the most recently selected region among the plurality of regions.
In addition, the controller 5020 can control the touch screen 5010 to display, on the button, an indicator indicating the active region.
Figure 49a, Figure 49b, Figure 49c and Figure 49d illustrate a method of displaying the button according to an embodiment of the disclosure.
Referring to Figure 49a, the display device 5100 can display a button 5122 on a boundary separating the regions in which the execution windows of a plurality of applications are displayed. In addition, the display device 5100 can define an arrangement line 5120 according to the layout of the application execution windows. The arrangement line 5120 can include dotted lines and the outline of the touch screen.
The display device 5100 can further define an arrangement region 5110. The arrangement line 5120 can be included in the arrangement region 5110.
As shown in Figure 49a, the arrangement line 5120 and the arrangement region 5110 can be determined according to the number and positions of the application execution windows displayed on the touch screen. For example, if the layout is a 2-region top/bottom split layout, a 2-region left/right split layout, a 3-region split layout or a 4-region split layout, the arrangement line 5120 and the arrangement region 5110 can be defined according to that layout.
Referring to Figure 49b, when an input moving the button 5122 into the arrangement region 5110 on the touch screen is received, the display device 5100 can move the button 5122 to the part of the arrangement line 5120 closest to the last position of the button 5122.
The display device 5100 can determine the regions in which the application execution windows are displayed based on the position of the button 5122 on the arrangement line 5120. In this way, the display device 5100 can set the display regions of the application execution windows.
Referring to Figure 49c, the display device 5100 can define arrangement points 5130 at specific positions of the arrangement line 5120. When a predetermined input for the button 5122 (for example, two consecutive touches on the button 5122) is received, the display device 5100 can move the button 5122 to an arrangement point 5130.
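A minimal sketch of such snapping, assuming the arrangement line is approximated by sampled candidate points (the sampling step, the Point type and the function name are assumptions of the editor, not the disclosed implementation):

// Illustrative sketch: the arrangement line is approximated by sampled points.
import kotlin.math.hypot

data class Point(val x: Double, val y: Double)

// When the button is released inside the arrangement region, move it to the
// point of the arrangement line closest to where it was dropped.
fun snapToLine(dropped: Point, linePoints: List<Point>): Point =
    linePoints.minByOrNull { hypot(it.x - dropped.x, it.y - dropped.y) }!!

fun main() {
    // A vertical arrangement line at x = 540, sampled every 10 pixels.
    val line = (0..1920 step 10).map { Point(540.0, it.toDouble()) }
    println(snapToLine(Point(480.0, 955.0), line))   // Point(x=540.0, y=950.0)
}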
Referring to Figure 49d, if the button 5122 is moved in the manner shown in Figures 49a, 49b and 49c, the movement can follow the behavior illustrated in Figure 49d. For example, when the center button 5122 is moved toward the arrangement line 5120 or an arrangement point 5130, the button 5122 can pass beyond the arrangement line 5120 or the arrangement point 5130 and then return to the arrangement line 5120 or the arrangement point 5130.
Embodiments of the disclosure provide a display device that can execute multiple windows on a single display and easily switch from one window to another window of lower priority, and a method of controlling the display device. Therefore, a user can use multiple applications in multiple windows at the same time. In addition, when multiple windows are displayed overlapping one another, the currently displayed window can easily be switched to another window of lower priority. Accordingly, in an environment in which multiple windows are displayed on the screen, the user can run a window of a desired size at a desired position with improved convenience.
It will be understood that various embodiments of the disclosure can be implemented in hardware, software, or a combination thereof. The software can be stored, regardless of whether the data are erasable or rewritable, in volatile or non-volatile memory such as ROM, in memory such as RAM, in a memory chip, device or integrated circuit, or in a storage medium that records data optically or magnetically and from which the data can be read by a machine (for example, a computer), such as a compact disc (CD), a digital versatile disc (DVD), a magnetic disk or a magnetic tape. In addition, embodiments of the disclosure can be implemented in a computer or portable terminal that has a controller and a memory suitable for storing a program or programs including instructions for implementing the embodiments of the disclosure. Accordingly, the disclosure includes a program having code for implementing the device or method defined by the claims, and a machine-readable storage medium storing the program. The program can be transferred electrically through a medium such as a communication signal transmitted via a wired or wireless connection, and the disclosure includes equivalents thereof.
The device can receive and store the program from a program providing device connected to it by wire or wirelessly. The program providing device can include a program containing instructions for implementing embodiments of the disclosure, a memory for storing information used for the embodiments of the disclosure, a communication module for communicating with the mobile device by wire or wirelessly, and a controller for transmitting the program to the mobile device automatically or upon request.
Although the disclosure has been shown and described in detail with reference to certain exemplary embodiments thereof, those skilled in the art will understand that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure as defined by the claims and their equivalents.

Claims (15)

1. A method for controlling a display device that has a touch screen and executes at least one application, the method comprising:
receiving an application execution command for executing the at least one application;
determining at least one of a size and a position of a window executing the at least one application according to a position at which the application execution command is input; and
displaying the window according to the at least one of the size and the position of the window.
2. The method of claim 1, further comprising displaying at least one icon representing the at least one application before receiving the application execution command.
3. The method of claim 2, wherein receiving the application execution command comprises receiving a drag gesture dragging the at least one icon to a first point on the touch screen.
4. The method of claim 1, further comprising splitting the touch screen into a plurality of window display areas according to a set layout,
wherein displaying the window comprises displaying the window in at least one of the plurality of window display areas.
5. The method of claim 4, wherein receiving the application execution command comprises receiving a command for executing a plurality of applications in a second window display area among the plurality of window display areas.
6. The method of claim 1, wherein displaying the window comprises comparing Z-orders of a plurality of windows and displaying the plurality of overlapping windows according to the Z-orders, and wherein the Z-order of the windows comprises a display order of each window.
7. The method of claim 6, further comprising receiving a Z-order change command requesting a change of the Z-order of a window.
8. A display device, comprising:
a touch screen configured to receive an application execution command for executing at least one application; and
a controller configured to: determine at least one of a size and a position of a window executing the at least one application according to a position at which the application execution command is input, and control the window to be displayed on the touch screen according to the at least one of the size and the position of the window.
9. The display device of claim 8, wherein the touch screen displays at least one icon representing the at least one application.
10. The display device of claim 9, wherein the touch screen receives a drag gesture dragging the at least one icon to a first point on the touch screen.
11. The display device of claim 8, wherein the controller splits the touch screen into a plurality of window display areas according to a set layout, and controls the window to be displayed in at least one of the plurality of window display areas.
12. The display device of claim 11, wherein the touch screen receives a command for executing a plurality of applications in a second window display area among the plurality of window display areas.
13. The display device of claim 8, wherein the controller compares Z-orders of a plurality of windows and controls the plurality of overlapping windows to be displayed according to the Z-orders, and wherein the Z-order of the windows comprises a display order of each window.
14. The display device of claim 13, wherein the touch screen receives a Z-order change command requesting a change of the Z-order of a window.
15. A display device having a touch screen and adapted to implement the method according to any one of claims 1 to 7.
CN201380071613.8A 2012-12-06 2013-12-06 Display device and method of controlling the same Active CN104956301B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910525925.0A CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201711096847.4A CN107967087B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525895.3A CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
US201261734097P 2012-12-06 2012-12-06
US61/734,097 2012-12-06
US201261737540P 2012-12-14 2012-12-14
US61/737,540 2012-12-14
US201261740887P 2012-12-21 2012-12-21
US61/740,887 2012-12-21
KR20130012019 2013-02-01
KR10-2013-0012019 2013-02-01
KR10-2013-0022422 2013-02-28
KR1020130022422A KR102172792B1 (en) 2012-12-06 2013-02-28 Display apparatus and method for controlling thereof
KR20130099927 2013-08-22
KR10-2013-0099927 2013-08-22
PCT/KR2013/011309 WO2014088375A1 (en) 2012-12-06 2013-12-06 Display device and method of controlling the same

Related Child Applications (3)

Application Number Title Priority Date Filing Date
CN201910525925.0A Division CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201711096847.4A Division CN107967087B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525895.3A Division CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same

Publications (2)

Publication Number Publication Date
CN104956301A true CN104956301A (en) 2015-09-30
CN104956301B CN104956301B (en) 2019-07-12

Family

ID=53054284

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201910525895.3A Active CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201711096847.4A Active CN107967087B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525925.0A Active CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201380071613.8A Active CN104956301B (en) 2012-12-06 2013-12-06 Display device and method of controlling the same

Family Applications Before (3)

Application Number Title Priority Date Filing Date
CN201910525895.3A Active CN110427130B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201711096847.4A Active CN107967087B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same
CN201910525925.0A Active CN110413191B (en) 2012-12-06 2013-12-06 Display apparatus and method of controlling the same

Country Status (3)

Country Link
CN (4) CN110427130B (en)
AU (1) AU2013356799B2 (en)
BR (1) BR112015012539B1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511778A (en) * 2015-11-25 2016-04-20 网易(杭州)网络有限公司 Interaction method device for controlling display of multiple game scenes
CN106202909A (en) * 2016-07-06 2016-12-07 沈阳东软医疗系统有限公司 A kind of image processing method and device
CN106403985A (en) * 2016-09-06 2017-02-15 深圳格兰泰克汽车电子有限公司 Vehicle-mounted navigation split-screen display method and device
CN106537319A (en) * 2016-10-31 2017-03-22 北京小米移动软件有限公司 Screen-splitting display method and device
CN106874097A (en) * 2017-02-28 2017-06-20 努比亚技术有限公司 The multi-screen display method and device of a kind of terminal screen
CN107526760A (en) * 2016-06-15 2017-12-29 Sk 普兰尼特有限公司 Interest information analysis method using rolling mode and the equipment using this method
CN109725979A (en) * 2019-01-28 2019-05-07 联想(北京)有限公司 A kind of display control method and electronic equipment
CN110928612A (en) * 2018-09-20 2020-03-27 网易(杭州)网络有限公司 Display control method and device of virtual resources and electronic equipment
CN111090366A (en) * 2017-05-15 2020-05-01 苹果公司 Method for multitasking, storage medium and electronic device
CN113535060A (en) * 2021-07-07 2021-10-22 深圳康佳电子科技有限公司 Screen splitting implementation method and device and storage medium
US11221698B2 (en) 2017-05-15 2022-01-11 Apple Inc. Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display
US11789589B2 (en) 2019-02-22 2023-10-17 Sony Group Corporation Information processing apparatus and information processing method for dividing display screen for display of plurality of applications

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109558051B (en) * 2018-11-21 2021-07-20 连尚(新昌)网络科技有限公司 Switching processing method and device of multifunctional page and computer readable storage medium
CN111212261B (en) * 2018-11-22 2021-07-20 浙江宇视科技有限公司 Scene switching method and device
CN110203786A (en) * 2019-06-05 2019-09-06 上海三菱电梯有限公司 A kind of elevator display apparatus and lift facility
CN112289339A (en) * 2020-06-04 2021-01-29 郭亚力 System for converting voice into picture

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US20080204424A1 (en) * 2007-02-22 2008-08-28 Samsung Electronics Co., Ltd. Screen display method for mobile terminal
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
CN102129345A (en) * 2010-01-19 2011-07-20 Lg电子株式会社 Mobile terminal and control method thereof
US20110239156A1 (en) * 2010-03-26 2011-09-29 Acer Incorporated Touch-sensitive electric apparatus and window operation method thereof
US20120169768A1 (en) * 2011-01-04 2012-07-05 Eric Roth Mobile terminal and control method thereof
CN102780932A (en) * 2011-05-13 2012-11-14 上海信颐电子科技有限公司 Multi-window playing method and system
CN103677627A (en) * 2012-09-24 2014-03-26 三星电子株式会社 Method and apparatus for providing multi-window in touch device

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02150919A (en) * 1988-12-01 1990-06-11 Fujitsu Ltd Display system for state display row at the time of dividing and displaying
US6212577B1 (en) * 1993-03-03 2001-04-03 Apple Computer, Inc. Method and apparatus for improved interaction with an application program according to data types and actions performed by the application program
AR029671A1 (en) * 2000-06-12 2003-07-10 Novartis Ag COLOR CONTACT LENS WITH MORE NATURAL APPEARANCE AND METHOD FOR MANUFACTURING IT
US7694233B1 (en) * 2004-04-30 2010-04-06 Apple Inc. User interface presentation of information in reconfigured or overlapping containers
KR20070001771A (en) * 2005-06-29 2007-01-04 정순애 Control method of screen data
US8645853B2 (en) * 2006-11-03 2014-02-04 Business Objects Software Ltd. Displaying visualizations linked to one or more data source queries
CN101606124B (en) * 2007-01-25 2013-02-27 夏普株式会社 Multi-window managing device, program, storage medium, and information processing device
CN101308416B (en) * 2007-05-15 2012-02-01 宏达国际电子股份有限公司 User interface operation method
CN101515227B (en) * 2008-02-20 2011-05-25 联想(北京)有限公司 Window management method and computer
US8229410B2 (en) * 2008-06-30 2012-07-24 Qualcomm Incorporated Methods for supporting multitasking in a mobile device
KR101548958B1 (en) * 2008-09-18 2015-09-01 삼성전자주식회사 A method for operating control in mobile terminal with touch screen and apparatus thereof.
US8302026B2 (en) * 2008-11-28 2012-10-30 Microsoft Corporation Multi-panel user interface
US20100180224A1 (en) * 2009-01-15 2010-07-15 Open Labs Universal music production system with added user functionality
US8627228B2 (en) * 2009-05-24 2014-01-07 International Business Machines Corporation Automatic sash configuration in a GUI environment
US9152299B2 (en) * 2009-10-08 2015-10-06 Red Hat, Inc. Activity management tool
US8208964B2 (en) * 2009-10-30 2012-06-26 Cellco Partnership Flexible home page layout for mobile devices
JP5800501B2 (en) * 2010-03-12 2015-10-28 任天堂株式会社 Display control program, display control apparatus, display control system, and display control method
DE202011110735U1 (en) * 2010-04-06 2015-12-10 Lg Electronics Inc. Mobile terminal
US20120144331A1 (en) * 2010-12-03 2012-06-07 Ari Tolonen Method for Arranging Application Windows on a Display
KR20120095155A (en) * 2011-02-18 2012-08-28 박철 Operation method of personal portable device having touch panel
CN102646010A (en) * 2011-02-22 2012-08-22 中兴通讯股份有限公司 Software switching method and device
CN102736903A (en) * 2011-04-08 2012-10-17 腾讯科技(深圳)有限公司 Method and device for managing widgets based on intelligent terminal desktop
KR101199618B1 (en) * 2011-05-11 2012-11-08 주식회사 케이티테크 Apparatus and Method for Screen Split Displaying
KR101841590B1 (en) * 2011-06-03 2018-03-23 삼성전자 주식회사 Method and apparatus for providing multi-tasking interface
CN102521034B (en) * 2011-12-27 2014-05-07 惠州Tcl移动通信有限公司 Multitask management method and multitask management system based on android system
CN102664747B (en) * 2012-03-27 2015-01-07 易云捷讯科技(北京)有限公司 Cloud calculating platform system

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US20080204424A1 (en) * 2007-02-22 2008-08-28 Samsung Electronics Co., Ltd. Screen display method for mobile terminal
US20100248788A1 (en) * 2009-03-25 2010-09-30 Samsung Electronics Co., Ltd. Method of dividing screen areas and mobile terminal employing the same
CN102365617A (en) * 2009-03-25 2012-02-29 三星电子株式会社 Method of dividing screen areas and mobile terminal employing the same
CN102129345A (en) * 2010-01-19 2011-07-20 Lg电子株式会社 Mobile terminal and control method thereof
US20110175930A1 (en) * 2010-01-19 2011-07-21 Hwang Inyong Mobile terminal and control method thereof
US20110239156A1 (en) * 2010-03-26 2011-09-29 Acer Incorporated Touch-sensitive electric apparatus and window operation method thereof
US20120169768A1 (en) * 2011-01-04 2012-07-05 Eric Roth Mobile terminal and control method thereof
CN102780932A (en) * 2011-05-13 2012-11-14 上海信颐电子科技有限公司 Multi-window playing method and system
CN103677627A (en) * 2012-09-24 2014-03-26 三星电子株式会社 Method and apparatus for providing multi-window in touch device

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511778A (en) * 2015-11-25 2016-04-20 网易(杭州)网络有限公司 Interaction method device for controlling display of multiple game scenes
CN107526760A (en) * 2016-06-15 2017-12-29 Sk 普兰尼特有限公司 Interest information analysis method using rolling mode and the equipment using this method
US10803506B2 (en) 2016-06-15 2020-10-13 Sk Planet Co., Ltd. Interest information analysis method using scroll pattern and apparatus using the same
CN106202909A (en) * 2016-07-06 2016-12-07 沈阳东软医疗系统有限公司 A kind of image processing method and device
CN106403985A (en) * 2016-09-06 2017-02-15 深圳格兰泰克汽车电子有限公司 Vehicle-mounted navigation split-screen display method and device
CN106537319A (en) * 2016-10-31 2017-03-22 北京小米移动软件有限公司 Screen-splitting display method and device
CN114201133A (en) * 2016-10-31 2022-03-18 北京小米移动软件有限公司 Split screen display method and device
CN106874097A (en) * 2017-02-28 2017-06-20 努比亚技术有限公司 The multi-screen display method and device of a kind of terminal screen
CN111090366A (en) * 2017-05-15 2020-05-01 苹果公司 Method for multitasking, storage medium and electronic device
CN111090366B (en) * 2017-05-15 2021-08-31 苹果公司 Method for multitasking, storage medium and electronic device
US11221698B2 (en) 2017-05-15 2022-01-11 Apple Inc. Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display
CN110928612A (en) * 2018-09-20 2020-03-27 网易(杭州)网络有限公司 Display control method and device of virtual resources and electronic equipment
CN110928612B (en) * 2018-09-20 2022-08-19 网易(杭州)网络有限公司 Display control method and device of virtual resources and electronic equipment
CN109725979A (en) * 2019-01-28 2019-05-07 联想(北京)有限公司 A kind of display control method and electronic equipment
US11789589B2 (en) 2019-02-22 2023-10-17 Sony Group Corporation Information processing apparatus and information processing method for dividing display screen for display of plurality of applications
CN113535060A (en) * 2021-07-07 2021-10-22 深圳康佳电子科技有限公司 Screen splitting implementation method and device and storage medium

Also Published As

Publication number Publication date
BR112015012539A2 (en) 2017-07-11
CN110427130A (en) 2019-11-08
AU2013356799A1 (en) 2015-05-14
CN107967087A (en) 2018-04-27
CN107967087B (en) 2021-08-17
BR112015012539A8 (en) 2019-10-01
CN110427130B (en) 2023-07-21
BR112015012539B1 (en) 2022-03-03
CN104956301B (en) 2019-07-12
AU2013356799B2 (en) 2019-08-08
CN110413191B (en) 2022-12-23
CN110413191A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
CN104956301A (en) Display device and method of controlling the same
US10671282B2 (en) Display device including button configured according to displayed windows and control method therefor
US11635869B2 (en) Display device and method of controlling the same
US11853523B2 (en) Display device and method of indicating an active region in a multi-window display
JP6550515B2 (en) Display apparatus for executing multiple applications and control method thereof
CN105683894B (en) Application execution method of display device and display device thereof
EP2690542B1 (en) Display device and control method thereof
US20230229287A1 (en) Display device and method of controlling the same
CN103853427A (en) Display device for executing a plurality of applications and method for controlling the same
CN104903830A (en) Display device and method of controlling the same
CN103853424A (en) Display device and method of controlling the same
KR102301053B1 (en) Display apparatus and method for controlling thereof
EP2753053B1 (en) Method and apparatus for dynamic display box management
JP2022521720A (en) Mini-program creation method, device, terminal and program
KR20140084966A (en) Display apparatus and method for controlling thereof
KR102360249B1 (en) Display apparatus and method for controlling thereof

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant