c# - Is it possible to get "contextual" gestures in MonoGame/XNA? -


I'm working on a multi-touch app using MonoGame, where multiple users can work on a large multi-touch screen with separate documents/images/videos simultaneously, and I'm wondering whether it's possible to make gestures "context-aware" — i.e. two fingers pinching a document on one side of the wall shouldn't affect panning on the other side of the wall.

The way MonoGame works is that input points are translated into gestures, which can be read using:

if (TouchPanel.IsGestureAvailable)
{
    var gesture = TouchPanel.ReadGesture();
    // do stuff
}

Is there a way to limit gestures to a region of the screen, or do I need to implement this myself? For example, looking at the source code, it appears the TouchPanelState class would do the job, but unfortunately its constructors are internal.

That feature is not built into MonoGame, as it was not part of the original XNA. What you'd want is more than one 'logical' TouchPanel, each defined for a sub-rectangle of the window. But TouchPanel is static, hence there is only one for the whole game in default XNA.

The good news is that MonoGame has its own gesture recognition. So the code is there; you just need to make changes to MonoGame.

E.g.:
- Make TouchPanel a non-static class that can be allocated for a given sub-rectangle.
- Add non-static versions of TouchPanel's static methods.
- Have the static methods redirect to a singleton/instance of TouchPanel, preserving the old API.
Now you can optionally allocate additional TouchPanel(s) that aren't the whole screen.
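As a stopgap that avoids patching MonoGame, a single game could drain the global gesture queue each frame and route every gesture to a screen region by its Position. A minimal sketch, assuming the standard TouchPanel/GestureSample API; GestureRegion and GestureRouter are hypothetical helper names:

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input.Touch;

// A sub-rectangle of the window with its own queue of gestures.
class GestureRegion
{
    public Rectangle Bounds;
    public readonly Queue<GestureSample> Pending = new Queue<GestureSample>();
}

class GestureRouter
{
    readonly List<GestureRegion> regions = new List<GestureRegion>();

    public GestureRegion AddRegion(Rectangle bounds)
    {
        var region = new GestureRegion { Bounds = bounds };
        regions.Add(region);
        return region;
    }

    // Call once per Update(): drain the static TouchPanel and
    // dispatch each gesture to the first region containing it.
    public void Update()
    {
        while (TouchPanel.IsGestureAvailable)
        {
            GestureSample g = TouchPanel.ReadGesture();
            var p = new Point((int)g.Position.X, (int)g.Position.Y);
            foreach (var region in regions)
            {
                if (region.Bounds.Contains(p))
                {
                    region.Pending.Enqueue(g);
                    break;
                }
            }
        }
    }
}
```

Note the limitation: recognition still happens globally, so this only routes by the first touch point's Position — a pinch whose two fingers straddle a region boundary is still recognized as one gesture. That is exactly why per-region recognizers inside MonoGame, as described above, are the cleaner fix.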

Note: this doesn't help you, but MonoGame allows you to have more than one OS window (on Windows, AFAIK), in which case the static TouchPanel serves the first window, and there is a separate API for accessing touch input / gestures for additional windows.

