That’s Right, Unity Is Not Dealing With This Critical Issue
Unity is one of the biggest, if not the biggest, 2D & 3D game engines on the market. The company supports every conceivable mainstream platform, from mobile through desktop and the Web to Virtual / Mixed Reality headsets.
This is the company that Apple & Google speak to when they want to roll out ARKit & ARCore. By the time WWDC or Google I/O rolls around, there is a beta version of Unity waiting in the wings with support for these changes.
If you have used a mainstream VR/MR app on Oculus Rift, HoloLens or HTC Vive, then the likelihood is that it was built in the Unity Editor.
Over the last year, a lot has changed for the company. Instead of saving big changes for major version roll-outs, each sub-version released into Beta is no longer just bug fixes and iterations: these Beta sub-versions contain major new features, and Unity isn’t shy about being transparent with its roadmap.
The big news is that Unity is making major inroads into the movie industry. The company teamed up with Neill Blomkamp to create the amazing animations for his OATS Studios movie shorts.
Many pundits believe that Unity is preparing itself for an IPO, given these changes, several major new strategic alliances, and changes to the company board.
What Does Unity Need To Fix?
Ultimately, Unity provides a software development platform in its Editor. Code is written in C# and, depending on the platform you are building your game or app for, converted accordingly (e.g. to C++ for iOS via IL2CPP). The Editor’s inability to let developers select all the platforms they need and run a single cross-platform build script is a frustration, but it is not the major failing.
QA can preview scenes in the Editor, but Unity does not support automated UI testing on devices.
Once you have built your apps, you have no way except hands-and-eyeballs testing (more commonly known, and argued over, as “manual testing”) to check whether your iOS / Android / WebGL app functions or delivers the UI as expected.
Anyone who works in QA knows that standard practice is to incorporate automated UI tests using frameworks like Selenium for web testing and/or Appium for mobile apps.
These frameworks rely on the ability to recognize and map the UI elements and objects within the app’s UI, but Unity apps are a black box as far as Selenium or Appium are concerned. If you can’t map the UI elements and objects, then you can’t script clicks, swipes, text inputs or other simulations of real user behavior.
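To make the black-box point concrete, here is a toy, pure-Python illustration (this is not real Appium code; the widget names and tree shapes are invented for the example). A native app exposes a hierarchy of widgets that a test driver can query by accessibility id; a Unity app renders the entire scene into a single surface, so the same query has nothing to find.

```python
# Toy model of why Appium-style drivers treat Unity apps as a black box.
# (Pure Python illustration; not the real Appium API.)

# A native Android screen exposes a hierarchy of widgets the driver can query:
native_tree = {
    "type": "FrameLayout",
    "children": [
        {"type": "Button", "accessibility_id": "StartButton", "children": []},
        {"type": "EditText", "accessibility_id": "NameInput", "children": []},
    ],
}

# A Unity app draws the whole scene into one rendered surface; the driver
# sees a single opaque view with no addressable children:
unity_tree = {"type": "SurfaceView", "children": []}

def find_element(tree, accessibility_id):
    """Depth-first search for a widget by accessibility id, as a driver would."""
    if tree.get("accessibility_id") == accessibility_id:
        return tree
    for child in tree.get("children", []):
        found = find_element(child, accessibility_id)
        if found is not None:
            return found
    return None

# Against the native hierarchy the lookup succeeds, so a tap can be scripted:
assert find_element(native_tree, "StartButton") is not None
# Against the Unity app the element tree is empty, so there is nothing to
# click, swipe or type into:
assert find_element(unity_tree, "StartButton") is None
```

This is the whole problem in miniature: without an addressable element tree, no locator strategy works, and every downstream interaction (click, swipe, text input) has nothing to target.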
This leaves game and app makers with three alternatives:
Manual testing alone is labor-intensive, time-consuming and repetitive. Cost aside, it depends on skilled testers with the ability to catch and report bugs.
Customer testing is an oxymoron and often a disaster waiting to happen, and yet some companies have no issue releasing their applications to their customers after catching only the critical issues.
Crowdsourced testing is a good interim solution for companies lacking testing personnel: a 3rd-party crowdsourcing company is paid to deliver the warm bodies needed to test on personal devices, for what amounts to first-to-find bug bounties.
The Warptest POV
Over the last year, I, along with one of my QA Engineers, tested several so-called automated testing solutions for Unity apps. Most didn’t make it out of the starting gate. Others showed early promise but would have needed extensive investment in development and testing to be anything more than a proof of concept.
All you need to do is search Unity’s Community Forums to see that this is in high demand. Many companies and Unity personnel I spoke to online were interested to hear what we had discovered, but if Unity wants its community of 2D / 3D game, application and movie animation makers to deliver robust, well-tested products, then automated UI testing needs to happen.
Today at 6:30pm Pacific Time, GDC, the Game Developers Conference, kicks off. Don’t disappoint me, Unity.
This is Unity’s summary blog post of its keynote at GDC. Color me disappointed: no mention of automated UI testing. I get it, automated testing in VR is a big challenge, but given the choice between doing nothing and at least supporting web / mobile automated testing on device, the choice is simple. FWIW, if I had to choose between Unity and a platform that supports automated testing, that choice would be simple too.
This is me throwing down the gauntlet, Unity.