What happened?
This week, my main task was contributing to the design and development of our minimum viable product (MVP). In class, we studied videos on Dropbox’s early case and the differences between a PoC, a prototype and an MVP, which gave me a more systematic understanding of the MVP concept. Afterwards, the team carried out its first feature prioritisation around our entrepreneurial idea and decided to build a simplified online platform.
Within the team, I was responsible for investigating user needs and contributing to the preliminary design of the landing page. I also helped collate the competitor comparison that will feed into subsequent A/B testing and user feedback collection.
So what?
At first, I thought an MVP simply meant “launching a minimal usable version first”. Through this week’s study, however, I realised that this understanding was too narrow. As Blank (2021) argued, the essence of an MVP is not a product prototype but a learning tool designed around hypothesis validation: a truly effective MVP should minimise investment and maximise learning.
While discussing with the team, I also realised how important it is to reach consensus on what the core feature actually is. One member wanted to integrate too many features at once, which risks masking the real user feedback signal (Ries & Euchner, 2013). This made me realise that in the early stages of a startup, staying focused is more strategically valuable than being feature-complete.
We used the “Testing Cards” template proposed by Bland & Osterwalder (2020) to systematically record our hypotheses, the tests we would run, and the criteria for judging them. This structured approach significantly improved our communication efficiency and helped us clarify exactly what we wanted to verify. We also drew on Tripathi et al.’s (2019) research on how MVPs evolve within startup ecosystems, which further strengthened our testing mindset and helps us avoid ignoring market feedback by becoming too technology-driven.
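To make the card structure concrete, here is a minimal sketch of how one of our test cards could be recorded in code. The field names follow the general shape of Bland & Osterwalder’s Test Card, but the class, example values and wording are illustrative assumptions rather than our actual cards.

```python
from dataclasses import dataclass


@dataclass
class TestCard:
    """One hypothesis-test record, loosely following the Test Card fields
    described by Bland & Osterwalder (2020)."""
    hypothesis: str          # We believe that ...
    test: str                # To verify that, we will ...
    metric: str              # And measure ...
    success_criterion: str   # We are right if ...


# Hypothetical example for our landing-page experiment
card = TestCard(
    hypothesis="Early users will sign up for the simplified platform",
    test="Show the landing page to 200 visitors via a small ad campaign",
    metric="Sign-up click-through rate",
    success_criterion="CTR of at least 5% within two weeks",
)
print(card)
```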
Another problem is that our understanding of user behaviour is still at the level of stated intention. As Lenny’s Newsletter (2023) emphasised, there is often an “intention-action gap” between what users say they want and what they actually use. This suggests that when collecting feedback we should pay more attention to real behavioural data, such as click-through rate and dwell time.
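As a rough illustration of what “behavioural data” means in practice, the sketch below computes click-through rate and average dwell time from a hypothetical session log. The field names and numbers are made up for illustration and do not reflect our real tracking schema.

```python
from statistics import mean

# Hypothetical event log: one record per landing-page session.
sessions = [
    {"clicked_signup": True,  "dwell_seconds": 42},
    {"clicked_signup": False, "dwell_seconds": 8},
    {"clicked_signup": True,  "dwell_seconds": 95},
    {"clicked_signup": False, "dwell_seconds": 15},
]

# Click-through rate: share of sessions that clicked the sign-up button.
click_through_rate = sum(s["clicked_signup"] for s in sessions) / len(sessions)

# Dwell time: average seconds spent on the page per session.
average_dwell = mean(s["dwell_seconds"] for s in sessions)

print(f"CTR: {click_through_rate:.0%}, average dwell time: {average_dwell:.1f}s")
```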
I realised that these abilities are not only crucial at the startup stage; they are also an indispensable way of thinking for future roles such as product manager or innovation strategy consultant. The ability to build, test and iterate on an MVP will strengthen my competitiveness in digital product life cycle management.
Now what?
I will lead the team in launching an MVP page based on the landing page and running A/B tests with different versions of the copy, so that we can collect user click behaviour and signals of payment intent. We plan to use the visitor tracking provided through Stripe to improve data quality. In next week’s discussion, I will suggest that the team adopt the PoC-Prototype-MVP path summarised by IntellectSoft (2023) to clarify the goals of each phase and avoid falling into endless optimisation.
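One way the A/B split could be implemented, independently of whichever tracking tool we settle on, is to assign each visitor deterministically to a copywriting variant so the same person always sees the same version. The sketch below shows this idea; the visitor IDs, variant names and hashing approach are assumptions for illustration and are not tied to Stripe.

```python
import hashlib
from collections import Counter


def assign_variant(visitor_id: str, variants=("copy_A", "copy_B")) -> str:
    """Deterministically map a visitor ID to one copy variant,
    so repeat visits always show the same version."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]


# Hypothetical usage: check that assignments split roughly evenly.
assignments = Counter(assign_variant(f"visitor-{i}") for i in range(1000))
print(assignments)  # approximately even counts for copy_A and copy_B
```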