I am one week into my first Beta-Test so it's a good time to share some immediate observations!
On the face of it this is an elegantly simple solution for inviting and managing up to 2000 external testers. To date I have 114 registered testers in the system, and everyone who has attempted to install the game has managed to do so without problems. The testers cover 7 countries across Europe, Asia and America, and many combinations of device and iOS version.
HOWEVER it has proved very unreliable in two important areas - sending email invites and tracking installations. Let me explain.
20% of the testers I put into the system fail to receive their instruction emails from Apple, yet the TestFlight dashboard tells me that every tester has been Notified. It is plainly wrong! This isn't just a case of emails lost in Junk folders - I checked. This is a real glitch in the Apple TestFlight system.
And I've read a lot of noise about this in forums too - I don't believe it's just my account that has been unlucky!
So I've had to resort to validating every tester by contacting them individually. 20% is a big number, and this has added a lot of unnecessary time to an already busy day. Some folks I have had to re-enter into the system 3 or 4 times before they finally got their email. So be prepared for this by following up with your would-be testers soon after registering them. Let's hope Apple's engineers can get this fixed quickly...
Every tester on the dashboard is marked as either 'Notified' or 'Installed' to show which stage of the process they are at. At the time of writing I have 6 testers who have definitely installed and played the game, but the system still has them in the 'Notified' state. It seems the 'Installed' state can appear several days after the installation actually happened! In a world where we monitor analytics daily - sometimes hourly - this is highly annoying. And coupled with the first problem, you are left really unsure whether your testers have received their invite, installed their build, or are already playing - until you hear from them!
PURPOSE OF THE BETA-TEST
I figured it was important to have some measurable goals for my beta test. It's only by having these that I can judge the success of the test and make decisions on next steps. The nature of the goals also determined how many testers I actually needed. In my case I figured 100 testers would be sufficient.
I have 3 goals, in this order of priority:
1 - To identify all major bugs that impact stability, key functionality, or user satisfaction.
2 - To see if the game is perceived to be too hard or too easy.
3 - To see if the D2, D7 and D30 retention figures are strong enough to go to Soft Launch.
I've worked in the gaming industry for 20 years, so I'm lucky enough to have a pretty wide network of people to ask to beta test. Facebook was my predominant way to reach them, and I've just tried LinkedIn too. To accelerate the spread of awareness I created a Facebook Page and a promo video, which you can see on this site. Responses were pretty immediate: of the 114 testers registered today (1 week after this started), 70 registered on day 1!
I also wanted to have a good proportion of testers who did not know me; whose opinions / reactions would be unbiased by any friendship or history. Having existing contacts share my story and my video on Facebook was a great way to achieve this. Also my girlfriend brought in many testers from her female clientele (she works in the beauty business). As a result over 1/3 of the testers are people I have never met. I feel this will add to the validity of any learning from the beta-test.
The beta test is a measure of the game itself, so I didn't want to skew the retention numbers by promoting engagement in ways that the regular app would not benefit from. That said, for the test to even work, I needed people to at least play the game for a while so they could contribute to the feedback.
So I manage a mailing list of all beta testers and send them a sort of progress update every 2-3 days. I've attempted to create a sense of community by sharing what bugs people are seeing and what features they are requesting. This has prompted a lot of good email discussion, allowing me to learn in more detail what people are experiencing in their first few days of playing. I also allow people to unsubscribe from this mailing list. So far only 1 person has opted out :-)
This is perhaps the hardest part of beta-testing. It has been amazing to see the wide range of views expressed. Some people are finding the game so easy that after 3 or 4 levels they are ready to leave. Others complain that after just 3 or 4 levels the game has become too hard and they are ready to leave!! Some say they are addicted, the difficulty is just right, and they are playing every day. Some LOVE the music, some HATE it. Some LOVE the graphics, some really think they SUCK.
Reading all the individual opinions can quickly become overwhelming and leave you confused about the best next steps. My first inclination was to 'make everyone like the game'. But the reality is that this is impossible. No game can achieve it. Even the most successful games lose more than 50% of their users after day 1.
To solve this conundrum I am employing two approaches. The first is to understand who my target audience is, and the second is to look at the DATA, not the OPINION.
By definition Smart Numbers is trying to target that group of players who want a more challenging and cerebral gameplay experience. The Sudoku crowd. It's a game requiring focus and concentration. There are many gamers who play just to chill - they do not want to concentrate. Using a food analogy, there are those who want a gourmet meal and those who just want to chew gum. Smart Numbers is for those with a big appetite and a love of food! With this in mind I try to understand the preferences of each tester and then promote the importance of feedback from those who seem to fit the profile of my target audience. Conversely, for the gum-chewers, I put any of their feedback that pertains to game difficulty into some perspective.
Data, data, data
The real beauty of digital gaming is its ability to give us real-time data about how our audience actually plays. To ignore this is a recipe for failure! Being aware of this I built some analytics into the game before sending it out. I am using Unity Analytics primarily for its ease of integration and also because the options it gives seemed to cover everything I need.
Unity Analytics gives you a bunch of stuff out of the box:
- DAU (daily active users), MAU (monthly active users), and therefore stickiness (DAU/MAU)
- Retention: Day 1, Day 7 and Day 30.
- Number of sessions per day and per user.
- Total time spent in game per day and per user.
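Unity Analytics computes all of the above for you, but it's worth being clear on the definitions. As a rough sketch - the player IDs and dates here are invented for illustration, not data from my test - stickiness and retention fall out of a simple per-player log of active days:

```python
from datetime import date, timedelta

# Hypothetical session log: player id -> set of days that player played.
sessions = {
    "p1": {date(2016, 8, 1), date(2016, 8, 2), date(2016, 8, 8)},
    "p2": {date(2016, 8, 1)},
    "p3": {date(2016, 8, 1), date(2016, 8, 2)},
}

def dau(day):
    """Number of players active on the given day."""
    return sum(1 for days in sessions.values() if day in days)

def mau(period):
    """Number of players active on at least one day of the period."""
    return sum(1 for days in sessions.values() if days & set(period))

def retention(cohort_day, n):
    """Fraction of players active on cohort_day who return n days later."""
    cohort = [days for days in sessions.values() if cohort_day in days]
    back = [days for days in cohort if cohort_day + timedelta(days=n) in days]
    return len(back) / len(cohort) if cohort else 0.0

august = [date(2016, 8, 1) + timedelta(days=i) for i in range(31)]
stickiness = dau(date(2016, 8, 1)) / mau(august)  # DAU/MAU
d1 = retention(date(2016, 8, 1), 1)  # 2 of the 3 day-one players return
```

The same `retention` call with `n=7` or `n=30` gives the Day 7 and Day 30 figures.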
You can also create custom events. I am tracking:
- % of players who progress beyond tutorial
- % of players who progress past each level, up to level 30 (This is called a FUNNEL)
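To make the funnel idea concrete, here is a minimal sketch of the arithmetic behind a funnel report (the event log and player count are made up; this is not the Unity Analytics API, just what the dashboard is computing for you):

```python
# Hypothetical custom-event log: one (player, level) pair per level completed.
events = [
    ("p1", 1), ("p1", 2), ("p1", 3),
    ("p2", 1), ("p2", 2),
    ("p3", 1),
]

def funnel(events, total_players, max_level):
    """Percentage of all players who completed each level."""
    completed = {lvl: set() for lvl in range(1, max_level + 1)}
    for player, lvl in events:
        completed[lvl].add(player)
    return {lvl: 100.0 * len(players) / total_players
            for lvl, players in completed.items()}

result = funnel(events, total_players=3, max_level=3)
# A sharp drop between two consecutive levels flags a difficulty spike.
```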
Things I am not tracking, but now wish I was!:
- % of players who go to the OPTIONS screen.
- % of players who sign-up to Facebook
- % of players who use the rewarded-video feature to suspend ads.
- Number of retries of each level before winning.
My advice is to think super-hard about what data is valuable to your game and make sure you have analytics to track it, before going to beta. If possible, brainstorm this with other people - it's the best way to see your game from a different perspective.
I also use Facebook's mobile cloud service, Parse, to keep track of players' scores per level. (Parse is being retired in January 2017, so I DON'T recommend you use it now!) Tracking these scores is allowing me to judge the relative difficulty of achieving grades C through A+.
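Parse itself is going away, but the scoring analysis is generic. A sketch of the idea, with invented grade cutoffs and scores rather than the game's real values: bucket each recorded score into a grade, then look at each level's grade distribution to judge how hard its top grades are.

```python
from collections import Counter

# Hypothetical grade cutoffs, highest first: minimum score for each grade.
THRESHOLDS = [("A+", 950), ("A", 800), ("B", 600), ("C", 0)]

def grade(score):
    """Map a raw score to a letter grade using the cutoffs above."""
    for name, cutoff in THRESHOLDS:
        if score >= cutoff:
            return name

# Invented scores recorded for one level.
level_1_scores = [990, 870, 640, 610, 500]
dist = Counter(grade(s) for s in level_1_scores)
# If almost nobody reaches A+ on a level, its top grade is probably too hard.
```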
After 14 months of working in isolation it has been a wonderful experience to release the game into the real world and get brand new perspectives on 'my baby'!
After just 1 week of data I can already see trends appearing and am starting to be able to measure the game against the goals I had set.
I thoroughly recommend doing this with your game before you launch for real - the opportunity to learn and improve is huge. Just think hard about the metrics you want to track with your analytics.
I will post another blog in a few weeks with the actual data from this test and the process of making decisions based on that data.