Typed

Typed is a collaborative knowledge management SaaS tool that streamlines scattered information and workflow into a single source of truth for teams to get more work done.

Overview

I joined Business Canvas when its product, Typed, was still in a closed-beta stage with only basic features. As a Product Manager, I had the opportunity to identify our ideal customer profile, conduct over 50 interviews, listen to users' pain points, hypothesize solutions, design the user experience, iterate on features multiple times, and see the metrics improve. After running over 15 sprints, I grew Typed from a closed beta into a collaborative SaaS tool that over 7,000 teams use globally.

Role:
Product Management, Project Management, User Experience Design, Data Analysis, Data Science
Team:
Team Typed (Business Canvas)
Duration:
October 2021 - Present
Results:
- Grew team customers from 0 to 7,000+
Current Status:
Production

Background

Business Canvas was founded in July 2020 with the vision of solving the problems of information silos and fragmented information workflows. Having been in the shoes of a student, a researcher, and a consultant, I was often challenged when trying to manage and organize my papers, reports, references, and more. The pieces of digital knowledge I wanted to put to use were scattered everywhere and were difficult to utilize.

Typed, the product Business Canvas was developing at the time, aimed to solve this challenge, so I decided to get on board.

Problem Identification

Typed is solving a challenge that every knowledge worker faces: information silos and fragmented workflows. The average knowledge worker spends 2.5 hours per day just looking for work-related information, burning countless hours on searching rather than on the actual work. With over 20% of working hours wasted, it is as if you hired five people and one of them literally never showed up to work. For startups, where every second and every resource matters, this is a serious problem.

Research should never have to be “re-searched”. A new way is needed to process digital knowledge, without noise or waste, one that fosters productivity rather than burdening it. By reducing information loss in the process of creating, communicating, and consuming information, Typed tackles the information inefficiency that happens in the workplace.

Customer Identification

Although the initial customer profile and the solutions addressing its pain points had already been set by the time I joined the team, I decided to redefine the customer profile and conduct multiple ICP interviews to better understand customers' needs. Our initial ideal customer profile centered on one common trait - a knowledge worker with a research-heavy workflow. For example, a VC analyst who writes research reports on various sectors and companies fit this profile.


Some of the common pain points we identified by interviewing 30+ researchers, analysts, and students were:
• They kept many tabs and windows open to track the references and resources they had found, and switching back and forth between their word processor and the reference tabs was tedious
• References, files, PDFs, and documents were buried in different folders and tools, making them difficult to find
• It was not easy to see the connections among the files and documents they had created and added

To solve these pain points, Team Typed had created a document editor where users can add any form of reference to their documents and view those references while writing, seamlessly in one interface.

Activation

Once I joined the team, I decided to dig into the user activation point. Given the pain points and needs identified earlier, it was clear that one key action users had to take to be "activated" was to create a document themselves. However, when I looked into the user behavior data, the conversion from clicking the document creation button to actually creating a document was very low.

The conversion rate from clicking the "Create Document" button to actually creating a document was below 40% for all new users at the time. This meant over 60% of new users churned without ever creating their first document.
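
For context, the sketch below shows one way a funnel conversion like this can be measured from a flat event log. The event names and log format here are illustrative assumptions, not Typed's actual analytics schema.

```python
from collections import defaultdict

# Hypothetical event log: (user_id, event_name) pairs exported from an analytics tool.
# The event names "doc_create_clicked" and "doc_created" are illustrative only.
events = [
    ("u1", "doc_create_clicked"), ("u1", "doc_created"),
    ("u2", "doc_create_clicked"),
    ("u3", "doc_create_clicked"), ("u3", "doc_created"),
    ("u4", "signed_up"),  # never clicked the button, excluded from the funnel
]

def funnel_conversion(events, step_from, step_to):
    """Share of users who fired step_from and also fired step_to (ordering ignored)."""
    seen = defaultdict(set)
    for user_id, event_name in events:
        seen[user_id].add(event_name)
    entered = {u for u, names in seen.items() if step_from in names}
    converted = {u for u in entered if step_to in seen[u]}
    return len(converted) / len(entered) if entered else 0.0

rate = funnel_conversion(events, "doc_create_clicked", "doc_created")
print(f"click -> create conversion: {rate:.0%}")  # 67% for this toy log
```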

UX Iteration - Activation

The as-is user flow of creating documents was as follows:

After analyzing the UX flow, I identified three points that were potentially causing the churn:
• When users click "Add document", the page itself does not change immediately
• The document selection modal adds too many steps, from choosing a project to typing the document title
• The label "Add document" is relatively weak - users want to create a document, not just add one

To address these three points above, I designed a new user flow of creating a document.

With the suggested UX flow, I worked with our designer and engineers to change the actual experience of creating the first document. Below is the new flow:

Initial Wireframe
Latest Iteration

After changing the user experience of creating a new document, I measured the same conversion metric again. The conversion rate from clicking the button to actually creating a document rose to 61%, a jump of more than 20 percentage points.

Customer Re-identification

After proving our product-market fit as a single-user productivity tool, we wanted to expand our position into a work collaboration tool. Our team believed the pain points of information silos, scattered information, and fragmented workflows affect not only individuals but also teams.
Therefore, I wanted to re-identify our customer profile - this time focusing on the collaboration workflow of a team space rather than an individual's research and writing workflow.

This persona was based on 10 team interviews. We interviewed teams of various sizes, from small startups to large corporations, and in different industries, from VC to consulting.
We decided to target early-stage startups, which have a low switching cost for products and tools and are not yet locked into existing ones.

User Flowchart

After brainstorming users' needs based on the pain points above, we decided to develop a document review request feature.

Based on our ideal customer profile, document writers in teams needed an easy way to share their documents with team members and ask them to review, read, or refer to them. However, with their existing workflow of sharing documents over Slack or directly in Google Docs, our customers had difficulty tracking whether their colleagues had actually read the documents or completed their reviews.

To address this need, I designed the following user flow for an in-app feature that lets users request a document review from colleagues.

With the flow chart above and the PRD I wrote, our design team created the initial prototype.

Design Iterations - Document Review Request Feature

Iteration 1 : Review Requesting Panel

Initially, the review request panel was a single interface where users could both send out review requests and view the status of existing requests. However, the first few user tests we conducted showed that putting previous requests and a new request in one space caused a lot of confusion. Because I had divided request status into two buckets, "In progress" and "Completed", users were essentially exposed to three different statuses at once (new, in progress, completed).

I decided to split the new review request and the statuses of previous requests into two separate tabs, as follows:

Initial Prototype
After Dividing Tabs
Iteration 2 : Request Purpose Tags

The initial purpose of this feature was to cut out unnecessary communication when sharing a document or requesting a review. More often than not, people at work spend extra time and effort writing greetings and filler when all they need to say is that they are asking for a review. Our idea was to remove this overhead and focus on the core review request.

With this philosophy, I wanted to create a set template that requesters could use without having to add extra words.
I first created three "purposes", or types of review: "Refer", "Write", and "Read".

Initial Prototype
First Iteration

We initially designed the prototype so that users could choose whichever "purpose" of review request they needed. After a few user tests, we realized we needed to be more specific, since "write" or "refer" on its own was not clear enough. We also wanted a set template text for each purpose so that users would not have to write a whole paragraph asking for a review. The first iteration above includes the template text.

However, placing the template text next to each "purpose" tag was misleading, because users did not understand that it was part of the message that would be sent to reviewers.

After moving the template text into a separate message box, I decided to rework the purpose tags as well.
First, the names "refer", "write", and "read" were confusing. After testing different names and connotations of the verbs, we decided to go with "to refer", "must-read", and "comment".

Second Iteration
Third Iteration

I hypothesized that "comment" carried the strongest notion of review, since reviewers have to actively leave comments, while "to refer" was the weakest of the three, as reviewers simply need to read through the document.

However, in most user tests, users pointed out that "must-read" felt more important, as it carries a mandatory connotation. To reflect what actual users felt, I changed the hierarchy so that "must-read" became the strongest request type.
In addition, a few users found it confusing that these three request types were not the actual request "messages", even though the section header read "Request Message". I made this clearer by changing the heading to "Request Type".

Iteration 3 : Request Message Box

As for the actual request message, since I wanted a set template for the request, I started with a preset message.

First Iteration

The preset message "Please have a read of this document" was paired with the "To refer" request type, and each type had its own message. Users could delete the message and rewrite it entirely, but our intention was that they would not have to delete or add much.

However, when we conducted user tests, most users deleted the default message completely and rewrote it in their own words. When I asked why, they explained that default messages can come across as a "lazy move" when sent to colleagues or bosses.

Users also could not tell how the message would be sent; they were confused about how it would look and what information it would include.

I decided to add a live message preview box that updates in real time as users type the message above. The box was intentionally designed so that users could see exactly what information would be sent out.

Second Iteration
Current Version

The second iteration still had not solved the problem of users deleting the default message. I realized the message might serve its purpose better if it worked like a fill-in template.

Instead of a different default text for each request type, I set the default message to "Reason for request: Required for meeting". My intention was that users would simply rewrite the reason if necessary. Our team ran more user tests with this version, and most users either rewrote just the reason or clicked Send Request without changing the text at all.


Iteration 4 : Request Status

Showing request status is one of the key functions of this feature. We found it essential to transparently show who had completed their review.

Initial Prototype
First Iteration
Second Iteration

The initial prototype grouped reviewers by the dates their review requests were sent. After redesigning the feature, we decided not to focus so heavily on the timeline of the reviews. Showing only user icons was also not clear enough to identify each reviewer.
The iterated version showed reviewers in a list view with a check circle in front of each icon. For incomplete requests, we left the circle blank with a dashed outline. However, during usability testing, some users tried to click the blank circles because the dashed line made them look clickable. In the following iteration, we replaced the dashed line with a solid one, making it clear that the circle is a status indicator rather than a button.
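
To summarize the shape the feature converged on, here is a minimal data-model sketch covering the request type, message, and per-reviewer status described above. All names and fields are illustrative assumptions, not Typed's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

# Illustrative model of the review request flow described above.
# Hypothetical names and fields, not Typed's actual implementation.

class RequestType(Enum):
    TO_REFER = "to refer"    # weakest: just have a look
    COMMENT = "comment"      # reviewers actively leave comments
    MUST_READ = "must-read"  # strongest: reading is mandatory

class ReviewStatus(Enum):
    IN_PROGRESS = "in progress"
    COMPLETED = "completed"

@dataclass
class ReviewRequest:
    document_id: str
    requester_id: str
    request_type: RequestType
    message: str  # editable preset, e.g. "Reason for request: Required for meeting"
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    # one status entry per reviewer, shown as the check circles in the status tab
    status_by_reviewer: dict[str, ReviewStatus] = field(default_factory=dict)

    def completed_reviewers(self) -> list[str]:
        """Reviewers who have finished, i.e. the filled check circles."""
        return [r for r, s in self.status_by_reviewer.items()
                if s is ReviewStatus.COMPLETED]
```

In this sketch, the per-reviewer statuses would drive both the "In progress"/"Completed" buckets and the check-circle indicators described above.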

Product

The current product Typed is available, for free, on typed.do.

Other Features

Besides the features and UX improvements explained above, I have also worked on the following features, among others:

PDF viewer and highlights
Document taskifying
Reference card design
Typed Mobile

Key Takeaways & Future Steps

My experience as a Product Manager at Typed has taught me not only how to lead a product team but also what makes a good product.
While building the document review request feature in particular, I had the chance to iterate on the product's details over and over based on user tests and interviews. A good product does not necessarily come from a brilliant idea or strong product sense; it comes from how closely you listen to users and how obsessed you are with their needs and pain points.