Please be aware that this was initially a question about how to get the input to work. A comment pointed me in the direction of a fix, but it only left me with more questions regarding the problem I was originally having, so I have edited my question to include the initial solution and focus on the question I still have.
The Problem:
Recently, I have moved from using GUIText and GUITexture objects to using the Canvas to display UI elements. In an effort to set up a prototype, some external GUI elements were imported, and when the game is built and run on my phone, everything works as intended.
However, if I simply run the game from inside Unity, input is completely ignored. I originally thought this might be a case of code grabbing touch-screen input and me expecting it to work with mouse input on a regular screen, but I have since confirmed that it still does not work when using Unity Remote to test the game on my phone.
It has since been pointed out that I was not using an EventSystem alongside my Event Trigger objects. After adding an EventSystem component, along with the default input module supplied by the EventSystem interface, the input works. However, this still leaves me confused. What is the point of enforcing the use of an EventSystem if it only ensures functionality during a play test? Why would I want this system in my end build, when I can confirm that the build version functions properly without it?
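For anyone hitting the same wall, the fix itself is small. Below is a minimal sketch of the safety net I could have used; the class name EnsureEventSystem is my own invention, and it simply creates an EventSystem with the default StandaloneInputModule if the scene lacks one:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: ensure exactly one EventSystem (with the default
// StandaloneInputModule) exists before the Canvas UI needs to process input.
public class EnsureEventSystem : MonoBehaviour
{
    void Awake()
    {
        // Without an EventSystem in the scene, Event Triggers and
        // uGUI Buttons receive no clicks or touches in the editor.
        if (FindObjectOfType<EventSystem>() == null)
        {
            var go = new GameObject("EventSystem");
            go.AddComponent<EventSystem>();
            go.AddComponent<StandaloneInputModule>();
        }
    }
}
```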
Below you will find my setup, contextual to the problem; I have included the component structure of my UICanvas and MainCamera objects, in case it is helpful. In this case, we have a button that increases the max speed when pressed. Everything works in a build, but nothing works during a play test.
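For context, the button's behaviour is roughly the following (the names SpeedUpButton, maxSpeed and speedStep are illustrative, not my actual code):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the handler wired to the button:
// pressing the button raises the maximum speed by a fixed step.
public class SpeedUpButton : MonoBehaviour
{
    public Button increaseSpeedButton; // assigned in the Inspector
    public float maxSpeed = 5f;
    public float speedStep = 1f;

    void Start()
    {
        // onClick only fires when an EventSystem is routing input
        // to the Canvas, which is exactly what was missing here.
        increaseSpeedButton.onClick.AddListener(IncreaseMaxSpeed);
    }

    void IncreaseMaxSpeed()
    {
        maxSpeed += speedStep;
    }
}
```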
Additional Information:
- I have enabled 'show touch input' on my phone, to confirm that the phone is still registering touch input.
- I moved to using the Canvas object after the 'deprecated' warning for using GUIText and GUITexture turned into a 'deprecated' error. Even if there is a suitable way to still perform these actions with the GUIText and GUITexture objects, I wish to persist with using the Canvas. I am also personally curious as to why this issue exists within the Canvas, as it seems to lead to obvious cases of severe inefficiency in a system that appears to have been put together with the sole point of being more efficient.
- I can create manual OnMouseDown buttons that work during a play test (see the sketch after this list); however, some GUI elements are more complex, and persisting with this sort of solution seems like an incredible waste of time.
- As previously mentioned, I do not have any input issues if I simply build and run the game on my phone. However, this takes considerably more time than a simple debug test, and even setting aside the time cost, I do not have access to Inspector manipulation during my play test.
- I am using Unity version 5.3.5 Personal. Any solution that does not involve updating Unity would be strongly preferred. While I have no practical problem updating Unity, in some previous projects updating Unity versions has led to massive issues that we would prefer to avoid outright, if possible.
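This is the kind of manual workaround I mean; a sketch, assuming a collider-based object (the class name ManualButton is mine). OnMouseDown is driven by the camera's physics raycast rather than the Canvas event pipeline, which is why it works without an EventSystem but does not scale to sliders, drags, or other complex UI:

```csharp
using UnityEngine;

// Sketch of a collider-based "button" that bypasses the Canvas
// event pipeline entirely. Requires a Collider for the mouse
// raycast to hit.
[RequireComponent(typeof(Collider))]
public class ManualButton : MonoBehaviour
{
    void OnMouseDown()
    {
        Debug.Log("Manual button pressed");
        // ...invoke the same behaviour the Canvas button would.
    }
}
```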
TL;DR:
Unity will not accept input on my canvas during a debug play test. I literally have to build and run my project each time I want to test input, or create dodgy hacks to provide input from the Inspector. Adding an EventSystem allows input during debug play testing, but it is not needed for input post-build. Why does Unity act this way, and why do I need an EventSystem solely for play-test input?
Answer
Given the consistency in user testing, and the failure to replicate this odd behaviour, I am content to put this down to a bug.
If anybody else experiences this problem, rest assured that it does not seem to pose any additional problems. Four months later, there have not been any apparent side effects in our game, apart from a very confusing initial debug phase.