Status: By Design
Votes: 0
Found in [Package]: 1.16.0
Issue ID: ISXB-1765
Regression: Yes
Simulated touch input with the new Input System does not work, unlike in older package versions
How to reproduce:
- Open the “2494927.zip” project
- Open the “SampleScene”
- Open Window → Analysis → Input Debugger
- Enable Options → “Simulate Touch Input from Mouse or Pen”
- Enter Play Mode
- Click the Button
- Observe the Console
Expected result: Log messages about the recurring simulated clicks appear in the Console
Actual result: Nothing is logged
Reproducible in: 1.14.1, 1.16.0 (6000.0.62f1, 6000.2.13f1, 6000.3.0f1, 6000.4.0a5, 6000.5.0a1)
Not reproducible in: 1.14.0 (6000.0.62f1, 6000.2.13f1, 6000.3.0f1, 6000.4.0a5, 6000.5.0a1)
Reproduced on: Windows 11 Pro (24H2)
Not reproduced on: No other environment tested
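
For reference, the attached “2494927.zip” project is not reproduced here, but a minimal stand-in along the lines below (hypothetical component and log message; only the standard UnityEngine.UI Button API is assumed) would produce the kind of Console output the reproduction steps expect when the Button is clicked:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical stand-in for the logging component assumed to exist in the attached
// project: it writes to the Console every time the UI Button is clicked.
public class ClickLogger : MonoBehaviour
{
    [SerializeField] Button button;   // assign the scene's Button in the Inspector

    void OnEnable()  => button.onClick.AddListener(LogClick);
    void OnDisable() => button.onClick.RemoveListener(LogClick);

    void LogClick() => Debug.Log("Button clicked");
}
```

The reported issue is that, with the simulation option enabled, clicks no longer reach such a handler.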
Resolution Note:
Question 1: If the user wants to simulate Touchscreen interaction, they don't need to enable "Simulate Touch Input from Mouse or Pen". That option is only for when the user actually wants to use a Mouse or Pen to simulate Touchscreen interactions. What this mode does is disable the Mouse and Pen devices (so they no longer receive Mouse events) and "route" their input through the simulated Touchscreen device. The TouchSimulation class in the Input System package contains more information about the implementation.
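
Since the note points at the TouchSimulation class, here is a minimal sketch of enabling the same mode from script rather than through the Input Debugger option (the component name is illustrative; TouchSimulation.Enable() and TouchSimulation.Disable() are the static methods the package exposes):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative component: toggles mouse/pen-to-touch simulation from code.
// While enabled, Mouse and Pen input is routed through the simulated Touchscreen device.
public class TouchSimulationToggle : MonoBehaviour
{
    void OnEnable()  => TouchSimulation.Enable();
    void OnDisable() => TouchSimulation.Disable();
}
```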
Question 2: The fact that this worked in 1.14.0 was due to a regression introduced while fixing another bug; the side effect happened to allow what the user wanted. 1.13, for example, does not allow it. Technically, the behavior regressed in 1.14.0 and was fixed in 1.14.1.
Question 3: If the option in question is "Simulate Touchscreen with Mouse and Pen", then yes. Again, this option is for interacting with the Mouse/Pen. There is nothing preventing users from adding their own devices and queuing input state events at runtime. However, they should understand that this has an impact on both the UIInputModule and PlayerInput components in the Event System. For instance, if the user is using PlayerInput components, it's wise to add the respective devices to the control schemes of the Input Actions Asset. In the example, only Keyboard & Mouse is set, which will not work as expected when using a Touchscreen. The Touchscreen device needs to be added to the control scheme; otherwise, it won't behave the way the user likely wants. I'd suggest looking at the default project-wide actions to understand more about the UI action and its control schemes.
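
As a rough sketch of the "add their own devices and queue input state events at runtime" route mentioned above (the component, touch ID, and positions are illustrative; InputSystem.QueueStateEvent with a TouchState is the low-level API assumed):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

// Rough sketch: create a Touchscreen device if none exists and feed it touch
// state events from code. Touch ID and positions are made up for illustration.
public class ScriptedTouchDriver : MonoBehaviour
{
    Touchscreen touchscreen;

    void OnEnable()
    {
        touchscreen = Touchscreen.current ?? InputSystem.AddDevice<Touchscreen>();
    }

    // Queue a press/release pair at the given screen position.
    public void Tap(Vector2 screenPosition)
    {
        InputSystem.QueueStateEvent(touchscreen, new TouchState
        {
            touchId = 1,
            phase = UnityEngine.InputSystem.TouchPhase.Began,
            position = screenPosition
        });
        InputSystem.QueueStateEvent(touchscreen, new TouchState
        {
            touchId = 1,
            phase = UnityEngine.InputSystem.TouchPhase.Ended,
            position = screenPosition
        });
    }
}
```

For this to drive the UI or a PlayerInput component as intended, the Touchscreen device still needs to be part of a control scheme in the Input Actions Asset, as described above.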