A mobile app usability testing solution consisting of custom applications running on Jetson, external cameras equipped with microphones, and custom streaming software installed on test participants’ mobile devices.
The solution provides a real-time, first-hand understanding of how users behave in the real world and of their mobile experience while playing a location-based AR game or interacting with another mobile product. It does so by simultaneously live-streaming each participant’s phone display, viewing angle, and voice, while also sending positional data, device statistics, and other data streams to the testing team. Its key components are:
A software package installed on test participants’ wearable iOS or Android devices that live-streams the device display and captures statistics, including CPU/GPU usage, temperature, and GPS data on the user’s location
An application on Jetson that connects to the mobile device, fetches configuration data to create the live audio and video streams, and encodes and saves copies of the data streams as a backup
A separate configuration application on Jetson that supervises the application above and keeps separate logs, so that data remains recoverable if a stream crashes. It also generates the configuration files the streaming application uses to adjust its streaming parameters
A pipeline that smoothly integrates all input data streams into a single modeled stream
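To make the device-statistics stream concrete, here is a minimal sketch of one telemetry sample as the mobile package might assemble it. The field names (`cpu_pct`, `gpu_pct`, `temp_c`, `gps`) are illustrative assumptions, not the actual wire format used by the solution:

```python
import json
import time

def build_stats_sample(cpu_pct, gpu_pct, temp_c, lat, lon):
    """Assemble one telemetry sample; all field names are illustrative."""
    return {
        "ts": time.time(),              # capture timestamp (epoch seconds)
        "cpu_pct": cpu_pct,             # CPU utilization, 0-100
        "gpu_pct": gpu_pct,             # GPU utilization, 0-100
        "temp_c": temp_c,               # device temperature, Celsius
        "gps": {"lat": lat, "lon": lon},  # participant location
    }

sample = build_stats_sample(37.5, 62.0, 41.2, 52.2297, 21.0122)
payload = json.dumps(sample)  # serialized line for the live stats stream
```

A JSON-lines stream of such samples is easy to merge later with the audio/video timeline, since each sample carries its own capture timestamp.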
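The supervisor component can be sketched as a small loop that writes a config file, launches the streaming process with its own log file, and relaunches it after a crash so the logs survive. This is a simplified illustration of the pattern, assuming a JSON config with hypothetical `bitrate_kbps`/`fps` keys; it is not the actual implementation:

```python
import json
import subprocess
from pathlib import Path

def write_stream_config(path: Path, bitrate_kbps: int = 4000, fps: int = 30) -> None:
    """Write a config file the streaming app re-reads on restart (illustrative keys)."""
    path.write_text(json.dumps({"bitrate_kbps": bitrate_kbps, "fps": fps}))

def supervise(cmd, log_path: Path, max_restarts: int = 3) -> int:
    """Run the streaming process, appending its output to a separate log.

    Relaunches the process after a non-zero exit, so the log written before
    a stream crash remains on disk. Returns the final exit code.
    """
    code = 1
    for _attempt in range(max_restarts + 1):
        with log_path.open("a") as log:
            code = subprocess.call(cmd, stdout=log, stderr=log)
        if code == 0:
            return 0
    return code
```

Keeping the supervisor's log separate from the streaming process's log is what makes a crashed stream diagnosable after the fact.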
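The integration pipeline above boils down to ordering independently captured, timestamped streams into one timeline. A minimal sketch of that merge step, assuming each event is a `(timestamp, source, payload)` tuple and each input stream is already time-ordered (as live capture streams are):

```python
import heapq
from operator import itemgetter

def merge_streams(*streams):
    """Merge timestamp-ordered event streams into one chronological stream.

    Each event is a (timestamp, source, payload) tuple; heapq.merge keeps
    the output sorted without loading whole streams into memory.
    """
    return list(heapq.merge(*streams, key=itemgetter(0)))

video = [(0.00, "video", "frame0"), (0.04, "video", "frame1")]
audio = [(0.01, "audio", "chunk0"), (0.03, "audio", "chunk1")]
gps   = [(0.02, "gps", (52.2297, 21.0122))]

timeline = merge_streams(video, audio, gps)
# events interleave in timestamp order across all three sources
```

A real pipeline would additionally need clock synchronization between the phone, the external cameras, and Jetson, but the ordering step itself stays this simple.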