What happens under the hood when touch surface is connected?

Postby bombarie » Tue May 23, 2017 11:49 pm

My question relates to an issue I'm having, not so much with TouchScript as with an embedded browser library I'm using, but I wonder whether the issue might be connected to TouchScript somehow. Bear with me, and I apologise if I'm severely off-topic.

I'm using the Embedded Browser asset to show website content in info panels that appear and disappear as the user requests them. The TouchScript library does a fantastic job of moving and interacting with the containers surrounding the browser instance, but there's a situation where I stop being able to interact with the browser instance itself. I'm wondering whether TouchScript is doing something under the hood that might be involved, because so far I can't find anything in the Embedded Browser asset. Here's what's going on, starting with some context:

  1. I'm using a 70" PCAP multitouch-sensitive screen. The multitouch input is provided to the computer through a USB cable.
  2. The Embedded Browser asset maps the mouse pointer to an x/y coordinate in its browser instance frame and also forwards mouse button clicks to the browser input. I've extended that interface class so that TouchScript PressGesture and ReleaseGesture gestures set the left mouse button to true or false, and a TransformGesture provides an alternative x/y coordinate. In other words, I'm faking the mouse input; as far as Embedded Browser knows, it's receiving mouse input.
  3. The above solution works both in the Editor and in the Standalone Player, BUT only if the touchscreen's USB cable is unplugged when the browser instance is created. Once the instance is on screen I can connect the USB cable, Windows plays its little 'found hardware' sound, and I can successfully interact with the browser instance through TouchScript. If, however, the USB cable is already plugged in when I instantiate the browser instance, no dice. Through debug logging I know the asset is receiving touch input data; it just doesn't result in actual browser interaction under the circumstances described here.

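For reference, the bridge described in point 2 looks roughly like this. This is a minimal sketch, not the actual code: the gesture event names are from TouchScript's public API (PressGesture.Pressed, ReleaseGesture.Released, TransformGesture.Transformed), but the browser-facing methods (SendMouseButton, SendMouseMove, ScreenToBrowserCoords) are hypothetical placeholders for whatever the extended Embedded Browser interface class actually exposes.

```csharp
using System;
using TouchScript.Gestures;
using TouchScript.Gestures.TransformGestures;
using UnityEngine;

// Hypothetical shim: translates TouchScript gestures on this object
// into fake mouse input for the embedded browser instance.
public class GestureMouseBridge : MonoBehaviour
{
    void OnEnable()
    {
        GetComponent<PressGesture>().Pressed += OnPressed;
        GetComponent<ReleaseGesture>().Released += OnReleased;
        GetComponent<TransformGesture>().Transformed += OnTransformed;
    }

    void OnPressed(object sender, EventArgs e)
    {
        SendMouseButton(left: true);    // fake left button down
    }

    void OnReleased(object sender, EventArgs e)
    {
        SendMouseButton(left: false);   // fake left button up
    }

    void OnTransformed(object sender, EventArgs e)
    {
        var gesture = (TransformGesture)sender;
        // Map the gesture's screen position into the browser frame's
        // x/y coordinate space (placeholder method).
        SendMouseMove(ScreenToBrowserCoords(gesture.ScreenPosition));
    }

    // --- placeholders for the extended Embedded Browser interface ---
    void SendMouseButton(bool left) { /* forward to browser input */ }
    void SendMouseMove(Vector2 browserXY) { /* forward to browser input */ }
    Vector2 ScreenToBrowserCoords(Vector2 screen) { return screen; }
}
```
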
If you're thinking of telling me that the smoking gun points squarely at the Embedded Browser plugin, I would agree with you. However, I've combed through its code and I don't see any if-then clauses or switch cases that react to underlying hardware being detected. I've contacted the developer and he's confused too. I've started looking through the TouchScript sources; so far, obviously, no insights. That's why I'm asking: does TouchScript actively detect underlying touch-input surfaces? I'd love to know where that happens; hopefully that would help me bug-hunt better.

Once again sorry if this is vastly off-topic. Hopefully it will spawn some useful insights for you and for me.

ps -> In the Standard Input panel I have the Windows 8+ and Windows 7 APIs set to 'Unity', but am I correct in thinking that these settings are only relevant to standalone builds?

Re: What happens under the hood when touch surface is connected?

Postby valyard » Thu May 25, 2017 12:34 pm

Hard to say remotely.
So you're saying that the gestures on the browser object do receive input events, right? That means the TouchScript input itself works.
Can you set a breakpoint in the code where the input goes to the browser component, and check whether the properties of the touch objects coming from TouchScript differ depending on when you plugged in the touch panel?
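A minimal logging version of that suggestion might look like the sketch below. It assumes TouchScript 9.x's Pointer API (ActivePointers, Pointer.Type, Pointer.InputSource); property names may differ in older versions. Running it once with the panel plugged in at startup and once without, then diffing the logs, would show whether the pointers themselves change.

```csharp
using TouchScript.Gestures;
using UnityEngine;

// Sketch: log the pointers behind each press so the two scenarios
// (USB plugged in at instantiation vs. plugged in later) can be compared.
void OnPressed(object sender, System.EventArgs e)
{
    var gesture = (PressGesture)sender;
    foreach (var p in gesture.ActivePointers)
    {
        Debug.LogFormat("pointer id={0} type={1} source={2} pos={3}",
            p.Id, p.Type, p.InputSource, p.Position);
    }
}
```
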
