
1. A method for detecting interactive inputs, comprising:
concurrently capturing touch input data on a screen of a user device and non-touch gesture input data, the gesture input data being indicative of a gesture performed in an area laterally offset from the screen of the user device;
determining an input command based at least in part on a combination of the concurrently captured touch input data and the non-touch gesture input data; and
affecting an operation of the user device based on the determined input command.
2. The method of claim 1, wherein the touch input data identifies a target item on the screen of the user device, and wherein the input command comprises an adjustment of an aspect of the target item based on the gesture input data.
3. The method of claim 2, wherein the input command comprises a variable adjustment of the target item, the variable adjustment being determined from the gesture input data.
4. The method of claim 1, wherein the capturing the touch input data further comprises receiving a touch, from a user on the screen of the user device, on a desired target item to be affected; and
determining that the target item has been disengaged by detecting a release of the touch on the screen of the user device.
5. The method of claim 1, wherein the capturing the non-touch gesture input data comprises detecting a location and a movement of an object.
6. The method of claim 5, wherein the movement of the object further comprises a movement substantially in a same plane as the screen of the user device.
7. The method of claim 1, wherein the capturing the non-touch gesture input data comprises using one or more sensors adapted to detect an object beyond a surface of the user device via ultrasonic technologies, image or video capturing technologies, or IR technologies.
8. The method of claim 1, wherein the determined input command comprises one of a plurality of different types of clicks.
9. The method of claim 8, wherein the different types of clicks further comprise a right-mouse click (RMC) created by a first pose of a hand touching the screen, or a left-mouse click (LMC) or an alternate click created by a second pose of the hand touching the screen, the first pose being different than the second pose.
10. The method of claim 1, wherein the capturing the non-touch gesture input data comprises capturing a hand pose or a hand motion.
11. A system comprising:
a display configured to display one or more images;
one or more sensors configured to detect touch input data at the display;
one or more sensors configured to detect non-touch gesture input data, the gesture input data being indicative of a gesture performed in an area laterally offset from the display; and
one or more processors configured to:
concurrently capture the touch input data and the non-touch gesture input data;
determine an input command based at least in part on a combination of the concurrently captured touch input data and the non-touch gesture input data; and
affect an operation of the system based on the determined input command.
12. The system of claim 11, wherein the touch input data identifies a target item on the display, and wherein the input command comprises an adjustment of an aspect of the target item based on the gesture input data.
13. The system of claim 12, wherein the input command comprises a variable adjustment of the target item, the variable adjustment being determined from the gesture input data.
14. The system of claim 11, wherein the one or more processors are further configured to receive a touch from a user on the display on a desired target item and determine that the target item has been disengaged by detecting a release of the touch on the display.
15. The system of claim 11, wherein the one or more sensors configured to detect non-touch gesture input data are further configured to capture a location and a movement of an object.
16. The system of claim 15, wherein the movement of the object further comprises a movement substantially in a same plane as the display.
17. The system of claim 16, wherein the one or more sensors configured to detect non-touch gesture input data comprise ultrasonic sensors, image or video capturing sensors, or IR sensors adapted to capture the non-touch gesture input data.
18. The system of claim 11, wherein the determined input command comprises one of a plurality of different types of clicks.
19. The system of claim 18, wherein the different types of clicks further comprise a right-mouse click (RMC) created by a first pose of a hand touching the display, or a left-mouse click (LMC) or an alternate click created by a second pose of the hand touching the display, the first pose being different than the second pose.
20. The system of claim 11, wherein the one or more sensors configured to detect non-touch gesture input data are configured to capture a hand pose or a hand motion.
21. An apparatus for detecting interactive inputs, comprising:
means for concurrently capturing touch input data on a screen of a user device and non-touch gesture input data, the gesture input data being indicative of a gesture performed in an area laterally offset from the screen of the user device;
means for determining an input command based at least in part on a combination of the concurrently captured touch input data and the non-touch gesture input data; and
means for affecting an operation of the user device based on the determined input command.
22. The apparatus of claim 21, wherein the touch input data identifies a target item on the screen of the user device, and wherein the input command comprises an adjustment of an aspect of the target item based on the gesture input data.
23. The apparatus of claim 22, wherein the input command comprises a variable adjustment of the target item, the variable adjustment being determined from the gesture input data.
24. The apparatus of claim 21, wherein the means for concurrently capturing the touch input data comprises means for receiving a touch, from a user on the screen of the user device, on a desired target item to be affected; and
wherein the apparatus further comprises means for determining that the target item has been disengaged by detecting a release of the touch on the screen of the user device.
25. The apparatus of claim 21, wherein the means for concurrently capturing the non-touch gesture input data comprises means for detecting a location and a movement of an object.
26. The apparatus of claim 25, wherein the movement of the object further comprises a movement substantially in a same plane as the screen of the user device.
27. The apparatus of claim 21, wherein the means for concurrently capturing the non-touch gesture input data comprises one or more sensors adapted to detect an object beyond a surface of the apparatus via ultrasonic technologies, image or video capturing technologies, or IR technologies.
28. The apparatus of claim 21, wherein the determined input command comprises one of a plurality of different types of clicks.
29. The apparatus of claim 28, wherein the different types of clicks further comprise a right-mouse click (RMC) created by a first pose of a hand touching the screen, or a left-mouse click (LMC) or an alternate click created by a second pose of the hand touching the screen, the first pose being different than the second pose.
30. The apparatus of claim 21, wherein the means for concurrently capturing the non-touch gesture input data comprises means for capturing a hand pose or a hand motion.
31. A non-transitory computer readable medium on which are stored computer readable instructions which, when executed by a processor, cause the processor to:
concurrently capture touch input data on a screen of a user device and non-touch gesture input data, the gesture input data being indicative of a gesture performed in an area laterally offset from the screen of the user device;
determine an input command based at least in part on a combination of the concurrently captured touch input data and the non-touch gesture input data; and
affect an operation of the user device based on the determined input command.
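As an illustration only, and not part of the claims, the combined touch-plus-gesture determination recited in claims 1 through 4 might be sketched as follows. All names, types, and the adjustment scale factor are hypothetical; the claims do not prescribe any particular data structures or mapping.

```python
from dataclasses import dataclass

@dataclass
class TouchInput:
    """Touch on the screen identifying a target item (claim 2)."""
    target_item: str
    engaged: bool  # a release of the touch disengages the target (claim 4)

@dataclass
class GestureInput:
    """Non-touch gesture captured laterally offset from the screen (claim 1)."""
    displacement: float  # movement substantially in the plane of the screen (claim 6)

def determine_command(touch, gesture):
    """Combine concurrently captured touch and gesture data into one command."""
    if touch is None or not touch.engaged:
        return None
    # Variable adjustment of the identified target item, derived from the
    # gesture data (claim 3); the 0.5 scale factor is an arbitrary example.
    return {"item": touch.target_item, "adjust": gesture.displacement * 0.5}

cmd = determine_command(TouchInput("volume_slider", True), GestureInput(40.0))
# → {'item': 'volume_slider', 'adjust': 20.0}
```

The point of the sketch is that neither input alone produces the command: the touch selects the target while the off-screen gesture supplies the variable magnitude.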

The claims below are in addition to those above.
All references to claims which appear below refer to the numbering after this sentence.

1. An organic electro-luminescent device, comprising:
a substrate;
two electrodes disposed on the substrate, one serving as an anode and the other as a cathode; and
an organic electro-luminescent structure interposed between the electrodes, comprising:
a fluorescent emissive layer;
a phosphorescent emissive layer having a host material; and
a nondoped organic material layer interposed between the fluorescent emissive layer and the phosphorescent emissive layer, having a highest occupied molecular orbital energy level lower than that of the host material in the phosphorescent emissive layer.
2. The device as claimed in claim 1, further comprising:
a hole injection layer disposed on the anode;
a hole transport layer interposed between the hole injection layer and the organic electro-luminescent structure; and
an electron transport layer interposed between the cathode and the organic electro-luminescent structure.
3. The device as claimed in claim 1, wherein the nondoped organic material layer has a thickness not exceeding 100 Å.
4. The device as claimed in claim 1, wherein the fluorescent emissive layer comprises a blue fluorescent material.
5. The device as claimed in claim 1, wherein the phosphorescent emissive layer comprises green and red phosphorescent materials.
6. The device as claimed in claim 1, wherein the host material in the phosphorescent emissive layer comprises carbazole biphenyl.
7. An organic electro-luminescent device, comprising:
a substrate;
an anode disposed on the substrate;
a fluorescent emissive layer comprising a blue fluorescent material disposed on the anode;
a nondoped organic material layer disposed on the fluorescent emissive layer;
a phosphorescent emissive layer comprising green and red phosphorescent materials disposed on the nondoped organic material layer, using carbazole biphenyl as a host material; and
a cathode disposed on the phosphorescent emissive layer;
wherein the nondoped organic material layer has a highest occupied molecular orbital energy level lower than that of the host material in the phosphorescent emissive layer.
8. The device as claimed in claim 7, wherein the nondoped organic material layer has a thickness not exceeding 100 Å.
9. The device as claimed in claim 7, further comprising:
a hole injection layer interposed between the anode and the fluorescent emissive layer;
a hole transport layer interposed between the hole injection layer and the fluorescent emissive layer; and
an electron transport layer interposed between the cathode and the phosphorescent emissive layer.
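As an illustration only, and not part of the claims, the structural constraints recited in claims 7 and 8 of this second set can be expressed as simple checks on a layer stack. The layer names, energy values, and helper function below are all hypothetical; HOMO levels are given in eV, where a "lower" level means a more negative value.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    homo_ev: float = 0.0      # HOMO energy level in eV (more negative = lower)
    thickness_a: float = 0.0  # thickness in angstroms

def check_device(stack, host_homo_ev):
    """Sketch of the constraints in claims 7-8 of the second claim set."""
    names = [layer.name for layer in stack]
    # Claim 7: anode / fluorescent / nondoped / phosphorescent / cathode order
    order_ok = names == ["anode", "fluorescent", "nondoped", "phosphorescent", "cathode"]
    nondoped = stack[names.index("nondoped")]
    # Claim 7: nondoped layer HOMO lower than that of the phosphorescent host
    homo_ok = nondoped.homo_ev < host_homo_ev
    # Claim 8: nondoped layer thickness not exceeding 100 Å
    thickness_ok = nondoped.thickness_a <= 100.0
    return order_ok and homo_ok and thickness_ok
```

The HOMO-offset check captures the recited relationship between the interposed nondoped layer and the phosphorescent host; the numeric values a real device would use depend on the chosen materials.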