Would it be possible to support sensors in addition to gestures?
I’m thinking of the following:
– flick the device to the right: open the sidebar
– flick it to the left: close the sidebar
– flick it down: open the app drawer
I hope you get the idea. It would be AWESOME and a really unique feature (…not that LL already has gazillions of unique features…).
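The flicks described above could be classified from accelerometer spikes. As a rough sketch (the class, axis convention, and threshold value are all assumptions, not anything from LL), the core decision logic might look like this; on Android it would be fed samples from a `SensorManager` accelerometer listener:

```java
// Hypothetical flick detector: maps a strong acceleration spike on the
// device axes to one of the requested flick directions. The threshold
// is a placeholder and would need tuning on a real device.
public class FlickDetector {
    // Minimum acceleration magnitude (m/s^2) to count as a deliberate flick.
    private static final float THRESHOLD = 12f;

    public enum Flick { NONE, LEFT, RIGHT, DOWN }

    // x = lateral acceleration, y = vertical acceleration (device axes).
    public static Flick classify(float x, float y) {
        if (x > THRESHOLD)  return Flick.RIGHT; // flick right -> open sidebar
        if (x < -THRESHOLD) return Flick.LEFT;  // flick left  -> close sidebar
        if (y < -THRESHOLD) return Flick.DOWN;  // flick down  -> open app drawer
        return Flick.NONE;
    }
}
```

A real implementation would also need debouncing (ignore samples for a short window after a flick fires) so one shake doesn't trigger several actions.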
In the next release there will be two new rotation-change events, but these will be limited to portrait and landscape. For more complex use of sensors, I think you may want to use an external automation app at the moment.
That said, some sensor handling would consume less battery if done in the launcher rather than in an external app (although not on the newest devices).
Tried it with Move’n’Launch: it works well, but requires an extra gesture first to enable recognition. Generally it works great, just as expected. Please take it into consideration.
https://play.google.com/store/apps/details?id=com.probayes.movenlaunch.lite
Seems interesting!