Book review: Sprint (Part 1 – Setting the stage)

I personally find it very encouraging to see that more and more companies go down the route of experimentation and continuous discovery. Businesses are starting to realise that committing to a single solution upfront and implementing it in the hope that it will be successful can be a very risky strategy. “Sprint – How To Solve Big Problems and Test New Ideas in Just Five Days” builds on this change by introducing the concept of a 5-day sprint in which to identify problems, explore possible solutions AND get feedback from real customers.

Jake Knapp and two of his colleagues at Google Ventures, John Zeratsky and Braden Kowitz, have successfully applied ‘sprints’ for a wide range of companies, helping the likes of Pocket and Blue Bottle to tackle difficult problems in just five days. This is how the five days are broken down:

  • Monday (day 1) – ‘Start at the end’; agree to a long-term goal and pick a problem to solve during the sprint
  • Tuesday (day 2) – Review and improve existing ideas, and sketch possible solutions
  • Wednesday (day 3) – Select a solution to focus on and create a storyboard
  • Thursday (day 4) – Build a prototype, a realistic ‘facade’
  • Friday (day 5) – Learn through customer interviews


Fig. 1 – Steven Nguyen “Reflecting on our first design sprint” – Taken from: https://sprintstories.com/reflecting-on-our-first-design-sprint-2d58519eefad#.ej6ivw4ma

The “Sprint” book contains a wealth of great techniques to utilise as part of a sprint. I want to do it justice and will probably devote a couple of posts to this great book. Before delving into each of the days of the sprint, let’s start by looking at ‘setting the stage’ before kicking off the sprint. It’s important to have the right challenge and the right team before you begin a sprint:

  • Challenge – As readers of my posts might know, I’m quite obsessed with understanding the problem(s) worth solving before exploring solutions. I therefore believe that picking the right problem or challenge to solve is absolutely critical to a successful sprint. Knapp, Zeratsky and Kowitz suggest three challenging situations where sprints can help: high stakes, tight deadlines or when you’re simply stuck.
  • Team – The key thing when assembling a sprint team is to get a ‘Decider’ in the team; someone who’s in a position to make important decisions. This can be the CEO or another important stakeholder. I like how the book provides a number of arguments one can use when a Decider is reluctant to get involved in the sprint (see Fig. 2 below). You should end up with a well-balanced team, made up of people who can implement as well as subject matter experts (see Fig. 3 below).

On top of picking a team, it’s also important to have a designated facilitator who can manage time, conversations, and the overall process. Naturally, this can be someone from within your company or someone external. For example, I know lots of digital agencies that facilitate sprints as part of a piece of work for their clients. As much as this is beneficial to the client, it also helps the agency by creating a shared and robust understanding of what’s going to be built and why.

Fig. 2 – Get a Decider (or two) involved – Taken from: “Sprint”, pp. 31-32

  • Rapid progress – Emphasise the amount of progress you’ll make in your sprint: In just one week, you’ll have a realistic prototype.
  • It’s an experiment – Consider your first sprint an experiment. When it’s over, the Decider can help evaluate how effective it was.
  • Explain the tradeoffs – Show the Decider a list of big meetings and work items you and your team will miss during the sprint week.
  • It’s about focus – Be honest about your motivations. If the quality of your work is suffering because your team’s regular work schedule is too scattered, say so. Tell the Decider that instead of doing an okay job on everything, you’ll do an excellent job on one thing.

Fig. 3 – Recruit a team of seven (or fewer) – Taken from: “Sprint”, pp. 34-35

  1. Decider – Examples: CEO, founder, product manager, head of design
  2. Finance expert – Examples: CEO, CFO, business development manager
  3. Marketing expert – Examples: CMO, marketer, PR, community manager
  4. Customer expert – Examples: researcher, sales, customer support
  5. Tech/logistics expert – Examples: CTO, engineer
  6. Design expert – Examples: designer, product manager

Main learning point: I recommend everyone do a ‘sprint’ before committing to a specific solution. “Sprint” is a great book, with a lot of helpful guidance as to how to best solve big problems in five days. I’d argue that some of the techniques to use as part of a sprint shouldn’t be constrained to a 5-day period or to the start of a piece of work; I’ll outline in the coming posts how some of the sprint exercises can be used on a continuous basis.

 


 

 

Related links for further learning:

  1. http://www.gv.com/sprint/
  2. https://sprintstories.com/

Priya Prakash explains Design Principles at Mobile Academy

As part of the Mobile Academy curriculum, I recently attended a class by Priya Prakash on “design principles”. Priya is a very experienced designer and has founded Design for Change, a London-based urban experience design studio.

Priya started off the session by explaining that design principles describe the core values and intended experience of a product or a service. Design principles help you make decisions about your product. She referred to a great definition of design principles by Luke Wroblewski (see Fig. 1 below). The important part of Luke’s definition is that all decisions can be measured against design principles.

“Design is what you decide not to do” was one of the key points that Priya raised in this class. It’s all about doing less and simplifying things. She talked about Spotify and Google Glass as good examples in this respect:

  1. Content first – Focus on the content, and remove any unnecessary user interface elements.
  2. Get familiar – Even though there is a clear distinction between a “lean forward” mode (Spotify desktop app) and “lean back” mode (Spotify mobile app), there’s a unified design language which has been executed consistently, irrespective of the device that you access Spotify from.
  3. Don’t get in the way – Google Glass is designed to be there when you need it and to be out of the way when you don’t. The goal is to offer engaging functionality that supplements the user’s life without taking away from it.
  4. Keep it relevant – Deliver information at the right place and time for each Google Glass user.

Priya then talked about motion user interface design principles:

  1. Personality – For example, the Pitchfork app has a magazine-like feel. It’s about understanding what the content is and translating this into appropriate behaviours.
  2. Responsive – Priya talked about the Clear app as being very responsive, explaining how this app gracefully expands or contracts.
  3. Context – Motion should give context to the content on screen by detailing the physical state of those assets and the environment they reside in.
  4. Emotive – This principle is all about evoking a positive emotional response. This kind of response can be triggered by a wide range of user interface elements, for example a smooth transition or a nice animation. Yelp’s app is a good example in this regard.
  5. Orientation – Motion should help ease the user through the experience. The “orientation” principle means that motion should establish the “physical space” of the app by the way objects come on and off the screen or into focus. The key is to get the flow of actions right, guiding the user on her journey and making sure she doesn’t feel lost or confused. Mobile apps like Yelp and Evernote do this pretty well in my opinion.
  6. Restraint – Keep it simple! Similar to the abovementioned “orientation” principle, it’s important not to bombard the user with too much animation or confuse them with too many interactions to choose from. This is one of the reasons why I’m such a big fan of single-purpose apps; I like the simplicity that they offer and the level of design restraint that they tend to apply.

Main learning point: I learned a lot from Priya Prakash’s class on design principles, particularly with respect to motion user interface design principles. Design principles can provide valuable guidance for the design of any software product or service and should therefore not be taken lightly. Thanks to Priya for a great class!

Fig. 1 – Definition of design principles by Luke Wroblewski – Taken from: http://www.lukew.com/ff/entry.asp?854

“Design principles are the guiding light for any software application. They define and communicate the key characteristics of the product to a wide variety of stakeholders including clients, colleagues, and team members.”

“Design principles articulate the fundamental goals that all decisions can be measured against and thereby keep the pieces of a project moving toward an integrated whole.”

Fig. 2 – What makes a good design principle? – Taken from Priya’s lecture at the Mobile Academy on 14 October ’14:

  • Specific enough to help make a choice
  • Focuses the team – avoid being broad
  • Measurable against user need or product/business goal

Related links for further learning: 

  1. https://developers.google.com/glass/design/principles
  2. http://www.theguardian.com/business/2014/aug/03/inside-spotifys-redesign
  3. http://www.lukew.com/ff/entry.asp?854
  4. http://pitchfork.com/news/52898-introducing-pitchfork-weekly-our-new-app/
  5. http://www.beyondkinetic.com/motion-ui-design-principles/
  6. https://www.vitsoe.com/gb/about/good-design
  7. http://www.uie.com/articles/creating-design-principles/
  8. http://www.slideshare.net/goldengekko/mobile-apps-design-trends-2014

App review: Do

I guess we all know how frustrating it can be to have to sit in meetings that just feel like a waste of time or that could have been dealt with in 30 minutes (instead of 3 hours). I know that there are quite a few apps out there which help us to run more productive meetings, but I decided to focus on Do:

  1. How did this app come to my attention? – I got an alert from Product Hunt about Tools for Product Managers, promising me a list of “the tools the pros use”. Do was only ranked 10th on this list, but it was this comment from one of the Product Hunt voters that intrigued me the most: “I was a Yammer PM. Do.com is the meetings platform I wished I had.” Especially given that it came from someone who used to be at Yammer – a company that is all about collaboration within the enterprise – this comment made me want to find out more about the product.
  2. My quick summary of the app (before using it) – Do helps you to have more productive meetings; I therefore expected a tool which helps its users to make their meetings as efficient as possible. The tool doesn’t yet seem to be available on iOS or Android, only on desktop.
  3. Getting started, what’s the sign-up process like? – I have to sign up to use Do. At present, Do only seems to support Google users; all non-Google users will be notified as soon as they’re able to sign up (see Fig. 1 below). Once I’ve selected my Google account, I get presented with a permissions screen (see Fig. 2 below). I click “Accept” and my personal dashboard appears. All fairly straightforward.
  4. How does the app explain itself in the first minute? – The default page of my dashboard shows a simple timeline with meetings on the relevant dates and times (see an example in Fig. 3 below). To be honest, I felt a bit underwhelmed at first, thinking “is this it!?”. However, the subsequent overlay which consisted of six ‘how to’ screens was quite useful, explaining in a simple but effective way how to best get started on Do (see Fig. 4 below).
  5. How easy to use was the app? – Using the tool felt very intuitive and easy. The layout of the dashboard is clear and easy to understand. Adding a new meeting to the dashboard felt no different to doing the same thing in Google or Outlook (see Fig. 5 below).
  6. How did I feel while exploring the app? – Like I mentioned above, exploring Do felt incredibly easy and intuitive. The signposting used in the tool is self-explanatory and the navigation options have been kept to a minimum. A quick click-through on an individual agenda item highlighted a key purpose of Do; the ability to create and share a meeting outline, making it easy to collaborate around meeting goals and agenda items (see Fig. 6 below).
  7. Did the app deliver on my expectations? – Yes, it did. I felt a bit underwhelmed at first, expecting Do to provide more, ‘less obvious’ features. However, whilst playing with the application, I discovered features like “Invite” and “Takeaways”, which I believe are missing from most standard diary / meeting applications.
  8. How long did I spend using the app? – A few days to start with, but I expect to be using it a lot more in the future!
  9. How does this app compare to similar apps? – I had a quick look at MeetingHero which serves a similar customer proposition to Do. At a first glance, MeetingHero seems a bit less advanced and intuitive in comparison to Do. MeetingHero is, however, available as an app on iOS which means that the app can be used on the go.

Main learning point: Do is a straightforward and easy to use meeting app. I like its interface and its key features; the app makes collaborating around meetings very easy. It will be interesting to see how Do will perform in an already crowded marketplace, with apps and systems that enable similar things. I’m now curious to see what the mobile version of the application will look like!

Fig. 1 – Screenshot of Do’s sign-up screen


 

Fig. 2 – Screenshot of Do’s permission screen

 


Fig. 3 – Screenshot of sample meeting in my meeting calendar in Do


 

Fig. 4 – Screenshot of one of the introductory ‘How to’ screens on Do


 

Fig. 5 – Screenshot of functionality in Do to create a meeting 


 

Fig. 6 – The ability to share a meeting goal and agenda items


 

Twitch and its appeal for Google and Microsoft

The other day, I heard about the rumoured takeover of Twitch by Google for the handsome amount of $1 billion. I have to be honest; up until that point I had never heard of Twitch. Reason enough to look into Twitch and a possible rationale for Google being willing to spend such a large amount of cash on this startup:

  1. What is Twitch? – Twitch is a video streaming platform and a community for gamers. Geekwire describes Twitch as “the ESPN of the video game industry” and says Twitch is a leader in that space. Twitch has over 45 million monthly users and about 1 million members who upload videos each month. In a relatively short space of time (Twitch was launched in June 2011), Twitch has successfully created an online streaming platform for video games.
  2. Who uses Twitch? – I’m not an avid video gamer myself, but browsing the Twitch website tells me that there are in effect two main user roles, which are closely intertwined: game players and broadcasters. Clearly, you can be both and I’m sure that a lot of Twitch members fulfil both roles. You can play games on Twitch channels like Counter-Strike: Global Offensive or World of Tanks, or you can create your own pages from which to broadcast games. A great example of Twitch’s success in engaging its community around a game is TwitchPlaysPokemon, which has had over 78,000 people playing a game that turns chat comments into controller inputs, parsing hundreds of thousands of ups, downs, and starts and translating them into in-game movements.
  3. Why is Twitch such an interesting acquisition target? – Twitch is reported to have snubbed Microsoft’s takeover offer but is rumoured to have fallen for Google. This raises the question of what makes Twitch such an interesting takeover target. I think that the answer can be split into two main factors. Firstly, scale. Twitch has a rapidly growing and very engaged user community who all share a passion for (video) gaming. Secondly, live broadcasting. Going back to the example of TwitchPlaysPokemon, Twitch streams games that get people excited and gets them participating in real-time. This simultaneous element is something that for instance YouTube is lacking. YouTube is great for on-demand video content, but (currently) less so for live event coverage or participation. The combination of both factors (as well as a very rich vein of user generated content and data) makes Twitch an extremely interesting target indeed.
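The TwitchPlaysPokemon mechanic described above – turning chat comments into controller inputs – can be sketched as a tiny command parser. This is purely an illustration of the idea and bears no relation to Twitch’s actual implementation:

```python
# Hypothetical sketch of the TwitchPlaysPokemon idea: chat messages are
# filtered down to valid controller commands, tallied, and the most
# popular command wins the round ("democracy" mode). Illustrative only.
from collections import Counter

VALID_COMMANDS = {"up", "down", "left", "right", "a", "b", "start", "select"}

def parse_chat(messages):
    """Keep only the messages that are valid controller commands."""
    return [m.strip().lower() for m in messages
            if m.strip().lower() in VALID_COMMANDS]

def pick_input(messages):
    """Return the most frequent valid command, or None if there is none."""
    commands = parse_chat(messages)
    if not commands:
        return None
    return Counter(commands).most_common(1)[0][0]

chat = ["UP", "up", "lol hi", "down", "Up ", "start"]
print(pick_input(chat))  # -> up
```

Even at this toy scale, the appeal for an acquirer is visible: every chat message is both engagement and structured input data.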

Main learning point: Recently there have been some major takeover deals in the digital industry – think Instagram, WhatsApp and Beats – but the rumoured acquisition of Twitch by Google is interesting for a number of reasons. If I have to highlight one key reason, then synergy is the main aspect that makes this potential takeover sound like a very exciting one. How will Google potentially integrate YouTube and Twitch or at least find a way to combine both platforms? Will the acquisition of Twitch help YouTube in cracking the real-time broadcast element of its offering? Let’s wait and see if the deal actually gets done in the first place, but if it does then I will definitely keep an eye out for any future developments involving Google, YouTube and Twitch.

Related links for further learning:

  1. http://www.fool.com/investing/general/2014/05/28/why-youtube-buying-twitch-for-1-billion.aspx
  2. http://www.geekwire.com/2014/microsofts-xbox-shot-potential-twitch-buyer-matters/
  3. http://variety.com/2014/digital/news/youtube-to-acquire-videogame-streaming-service-twitch-for-1-billion-sources-1201185204/
  4. http://techcrunch.com/2014/05/27/tc-cribs-tours-twitch-tv-gaming-office-headquarters/
  5. http://thenextweb.com/insider/2014/05/29/twitch-now-lets-filter-counter-strike-global-offensive-game-streams-map-skill-level/
  6. http://thenextweb.com/google/2014/05/19/google-reportedly-wants-buy-video-streaming-service-twitch-1b-deal-boost-youtube/
  7. http://variety.com/2014/digital/news/why-google-wants-to-hitch-twitch-and-youtube-1201188093/
  8. http://www.theverge.com/2014/5/20/5734108/why-twitch-could-be-the-best-billion-google-ever-spends
  9. http://guardianlv.com/2014/02/twitch-live-broadcast-to-be-included-in-xbox-one/


Some considerations regarding data-driven design

Whilst slightly struggling to identify the most effective measurements for my own product, I’m nevertheless learning lots of new things about when (not) to use data in developing products. I’ve been learning more about data-driven product design and some of the key things to consider when using data to inform your product decisions:

  1. Measuring events – What are the key events that we would like to track, and why? This is likely to vary per team or stakeholder. For instance, the number of events that Sales are most interested in is likely to be much smaller than the events that I, as a Product Manager, want to track (see Fig. 1 below). I guess the main thing here is that you pick the appropriate events to measure and set clear goals as to how you would like these events to perform. A good example is Wooga, a German games company, where the product team have a number of KPIs and metrics that they’re looking to deliver on. Each week they’ll pick a KPI, e.g. retention, and will look at all the activities they can design and measure to increase the chosen KPI. Alistair Croll and Benjamin Yoskovitz have introduced the notion of the “One Metric That Matters” in this respect, urging businesses to focus on a single metric that will really impact their business.
  2. Retention vs customer engagement – I like to distinguish between customer engagement and customer retention. Often they tend to get lumped together, but in my mind retention is all about if and how often people revisit your site or application. For instance, I find it helpful to do a cohort analysis to compare the number of users that signed up during specific time periods and their revisit rates. These figures should be a good reflection of site performance over time, the idea being that revisit rates will go up as you continue to improve the site (blogger Andrew Chen has written a great blog post about this). With customer engagement, I tend to be much more interested in metrics such as click-through rates, conversions or discussions around content. Such metrics give a better insight into the extent to which users engage with content or a service.
  3. Limitations of A/B and multivariate testing – I’m a big fan of testing multiple versions of a design, whether it’s just to compare two design versions (A/B) or to compare multiple variations of different design elements (multivariate testing). Again, the main challenge here is to ensure you’re testing the right things. You can potentially test a thousand different variables and combinations per web page or application, so I believe it’s critical to start off with the right business questions and to be disciplined about the things that you want to test (see Fig. 2 below).
  4. Data isn’t everything – Whether the data you generate is quantitative or qualitative (or both), I strongly believe that data doesn’t replace product vision or intuition. Data provides a very useful perspective when making a decision, but it shouldn’t be the only factor you’re considering. I know that a lot of my peers disagree with this view, but I’ve identified some constraints over time when it comes to relying on data (see Fig. 3 below). Essentially, I believe that data provide a very valuable lens to look at product performance but data can never be a substitute for ‘going out of the building’ (and talking to customers or competitors) or gut feeling. Data can help in validating intuition or initial assumptions but you’ll need to start somewhere!
  5. Who are you measuring? – I’m learning more about user segmentation and how this can be reflected in the specific things you measure. I found the “See-Think-Do” framework by analytics guru Avinash Kaushik (see Fig. 4 below) very helpful in this respect. It helps to think about specific metrics to measure in relation to specific groups of users. I always find it very helpful to look at analytics within the context of user cohorts, just to get a better perspective.
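The cohort analysis mentioned in point 2 above boils down to a simple calculation: of the users who signed up in a given period, what share came back in a later period? A toy sketch, with made-up data rather than output from any real analytics tool:

```python
# Toy cohort revisit-rate calculation. Each tuple is
# (user_id, signup_month, visit_month); the data is invented.
def revisit_rate(events, cohort, later_month):
    """Share of a signup cohort that visited again in a later month."""
    cohort_users = {u for u, signup, _ in events if signup == cohort}
    returned = {u for u, signup, visit in events
                if signup == cohort and visit == later_month}
    return len(returned) / len(cohort_users) if cohort_users else 0.0

events = [
    ("u1", "2014-01", "2014-01"), ("u1", "2014-01", "2014-02"),
    ("u2", "2014-01", "2014-01"),
    ("u3", "2014-02", "2014-02"), ("u3", "2014-02", "2014-03"),
]

print(revisit_rate(events, "2014-01", "2014-02"))  # -> 0.5
```

Comparing this rate across successive cohorts is what tells you whether the product is genuinely improving over time.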

Main learning point: data can provide a great framework for making business or product decisions. There are numerous professionals and companies out there who make decisions solely based on data. Data are objective and tangible. However, the pitfalls of solely relying on user data shouldn’t be underestimated in my view.

Firstly, one can easily end up measuring the wrong thing or getting an incomplete picture. Secondly, one can become paralysed by data, not trusting your product vision and becoming very driven by the users that you have (and not necessarily the users that you want).

If anything, I’ve realised again the importance of establishing a product vision, goals and assumptions first, before you even start contemplating which metrics to measure!

Fig. 1 – Examples of event types that I tend to track

  • Registration landing

  • Registration completion

  • Products entered in basket

  • Proceed to checkout

  • Account creation

  • Page view

  • Click-through on search results

  • Documents created

  • Document shared

  • Post created

  • Post updated

  • Post shared

  • Comment created

  • Purchase

  • Visits by a specific group of visitors
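To make the event types above concrete, here is a minimal sketch of how such events might be captured in code. The helper and the event/property names are illustrative assumptions, not the API of any particular analytics product:

```python
# Minimal event-tracking sketch: each event gets a name, a timestamp and
# optional properties, and is appended to an in-memory log. A real product
# would send these to an analytics backend; all names here are invented.
import time

event_log = []

def track(event_name, **properties):
    """Record a single event with arbitrary key-value properties."""
    event_log.append({
        "event": event_name,
        "timestamp": time.time(),
        "properties": properties,
    })

track("registration_completion", plan="free")
track("proceed_to_checkout", basket_size=3)

print([e["event"] for e in event_log])
# -> ['registration_completion', 'proceed_to_checkout']
```

The point of the properties is that the same event list can then serve different stakeholders: Sales might only query purchases, while a Product Manager can slice every event by cohort or segment.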

Fig. 2 – Things I’ve learned so far about A/B and multivariate testing

What is it that you want to test?

I’ve learned to be very specific about the business questions that I want answers to:

  • I want to improve the conversion rate of the basket checkout process by 5%

  • The aim is to get 1,000 people to respond to the call to action in the email campaign

  • To see if the time spent on a page drops if we introduce page ads

  • To create a shift from “spectators” to “participators”

  • To increase employee productivity from 2% to 5%

  • To convert the current 200 “inactive users” to “active users”, by having them complete at least one activity per month

What does success look like?

This is where I look at Key Performance Indicators and the related metrics to measure and test with:

  • “We know that the new landing page of our site or app is successful if we can reduce the bounce rate by 10%”

  • “We know that the new sign-up process is successful if we manage to increase the sign-up rate by 10% in the first week after launch of the new, simplified sign-up process”

  • “We know that the new “share an update” button is more effective if we see a 5% growth in month-on-month number of status updates shared.”

  • “We know that our user engagement is improving when users who created a group are still creating groups in March”

Being disciplined about goals and picking the right variables

I’ve learned to try and stick to one goal per page to test. Otherwise there’s a risk of things getting messy, making it hard to measure things and – most importantly – to get any meaningful outcomes from your testing in the first place.

Picking the right variables is another key thing. Which elements tend to cause the most friction (e.g. forms, sign-up, page length and process steps)? Which elements are key in achieving your goals? Also, make sure you don’t waste too much time on trivial elements such as text or headlines, because in the context of the key goals that you’re trying to achieve they’re likely to have less of an impact.
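Being disciplined also means not declaring a winner on a small difference that could be noise. A standard two-proportion z-test is one way to sanity-check an A/B result; here is a stdlib-only sketch, with made-up conversion numbers:

```python
# Two-proportion z-test: is the difference between two conversion rates
# likely to be real, or just noise? Standard statistics, stdlib only.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z score, two-sided p-value) for conversions out of visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up example: variant B converts 120/1000 vs variant A's 100/1000.
z, p = two_proportion_z(100, 1000, 120, 1000)
print(round(z, 2), round(p, 3))
```

With these numbers the p-value comes out well above the conventional 0.05 threshold, i.e. a 2-percentage-point lift on a thousand visitors per variant is not yet conclusive; this is exactly why setting a target lift and sample size upfront matters.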

Fig. 3 – Some constraints when it comes to relying on data 

  1. Data provides an insight into the ‘what’ but not necessarily into the ‘why’ or ‘how’ – Particularly quantitative data can be great when it comes to monitoring incremental change but is quite limited in providing real customer insight or show which new features create a breakthrough change. I therefore believe that data always needs to be augmented by other perspectives such as user feedback, competitor analysis, etc.
  2. You still need creativity, strategy and intuition (1) – I’ve seen the risk of people thinking that their analytics data were the holy grail, succumbing to ‘analysis paralysis’ or becoming risk-averse, unable to make decisions without any available data. One could argue that this isn’t necessarily the fault of data but that of the decision-making process around it. My point is that data is one – important – source of information to base a product decision on, but it shouldn’t be your sole perspective. For instance, there might be internal business or technical aspects that need to be considered.
  3. You still need creativity, strategy and intuition (2) – Also, customer monitoring can be very reactive, in a sense that you’re following your customers through their experience and optimise accordingly. However, you might want to drive to a specific strategy or have a new product (the relevant metrics to measure might only emerge over time). Ultimately, you will need to be creative or take a leap of faith to get the results that you want.

Fig. 4 – The “See-Think-Do: Measurement Strategy” framework by Avinash Kaushik

Related links for further learning:

  1. https://marcabraham.wordpress.com/2013/05/03/book-review-lean-analytics/
  2. http://altmetrics.org/manifesto/
  3. http://www.wooga.com/2012/07/
  4. http://www.slideshare.net/wooga/metrics-driven-designdeveloper-conference-hh2012stephanie-kaiserjesper-richterreichhelm
  5. http://www.youtube.com/watch?v=MQcjAVY4xgk
  6. http://www.infoq.com/interviews/kaiser-metrics
  7. http://www.90percentofeverything.com/2011/01/06/local-maxima-and-the-perils-of-data-driven-design/
  8. http://www.wired.co.uk/magazine/archive/2012/01/features/test-test-test
  9. http://www.smashingmagazine.com/2010/08/26/in-defense-of-a-b-testing/
  10. http://37signals.com/svn/posts/2431-basecamp-home-page-redesign
  11. http://andrewchen.co/2008/09/08/how-to-measure-if-users-love-your-product-using-cohorts-and-revisit-rates/
  12. http://www.forentrepreneurs.com/customer-engagement/
  13. http://www.kaushik.net/avinash/see-think-do-content-marketing-measurement-business-framework/
  14. http://www.kaushik.net/avinash/lean-analytics-cycle-metrics-hypothesis-experiment-act/
  15. http://insideintercom.io/the-problem-with-data-driven-decisions/
  16. http://andrewchen.co/2009/03/02/does-ab-testing-lead-to-crappy-products/
  17. http://www.webdesignerdepot.com/2013/05/the-perils-of-ab-testing/
  18. http://tech.onefinestay.com/post/43394522139/ab-testing-the-checkout-less-is-more