Would it be possible to support sensors in addition to gestures?

I’m thinking of the following:

– flick the device to the right: open the sidebar

– flick it to the left: close the sidebar

– flick it down: open the app drawer

I hope you get the idea. Would be AWESOME and one really unique feature (…not that LL already has gazillions of unique features…).
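The flick mapping above could be sketched as a simple threshold test on accelerometer samples. This is only an illustration of the idea, not anything in Lightning Launcher: the `FlickDetector` class, the `Flick` names, and the threshold value are all assumptions. On Android one would feed it samples from the `TYPE_LINEAR_ACCELERATION` sensor (gravity already removed).

```java
// Hypothetical sketch of flick classification from accelerometer samples.
// The class, enum, and threshold are illustrative assumptions, not an
// existing launcher API. Input is a linear-acceleration sample in m/s^2
// (gravity already subtracted, as Android's TYPE_LINEAR_ACCELERATION gives).
public class FlickDetector {
    public enum Flick { NONE, LEFT, RIGHT, DOWN }

    private final float threshold; // minimum acceleration to count as a flick

    public FlickDetector(float threshold) {
        this.threshold = threshold;
    }

    // Classify a single sample; x is the device's left/right axis,
    // y its up/down axis (Android sensor coordinate convention).
    public Flick classify(float x, float y, float z) {
        if (x > threshold)  return Flick.RIGHT; // flick right -> open sidebar
        if (x < -threshold) return Flick.LEFT;  // flick left  -> close sidebar
        if (y < -threshold) return Flick.DOWN;  // flick down  -> open app drawer
        return Flick.NONE;
    }
}
```

A real implementation would also need debouncing (ignore samples for a short window after a flick fires) so one shake doesn't trigger several actions.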


3 Comments to Would it be possible to support sensors in addition to gestures?

  1. Anonymous says:


    In the next release there will be two new rotation-change events, but these will be limited to portrait and landscape. For more complex use of sensors, I think you'll need an external automation app for now.


    That said, some sensor handling would consume less battery if done in the launcher rather than in an external app (though not on the newest devices).


  2. Anonymous says:


    Tried this with Move’n’Launch; it works well, but it requires an extra gesture first to enable recognition. Generally, though, it works great, just as expected. Please take it into consideration.



    https://play.google.com/store/apps/details?id=com.probayes.movenlaunch.lite


  3. Anonymous says:


    Seems interesting!

