What is the challenge? |
Customer interviews include screen actions, screen movements, and choices customers make that are never verbalized. We need these screen actions and movements captured in the transcript for analysis. Right now, we have a workaround: we use an internal tool to capture the screen actions in transcript form, then import that transcript into Aha Discovery.
What is the impact? |
Understanding the actions our customers take while using our product is very important for analysis. An interview that only captures verbal transcription misses the choices the customer makes in the product that are not verbalized. If Aha Discovery could do this level of transcription, it would eliminate the double work for the team. Multiple product managers on the team said this would be "very valuable" if Aha Discovery could do this.
Describe your idea |
Enable transcription in Aha Discovery to include screen movements and screen actions taken during a discovery interview, providing a complete and thorough analysis of the interview for product feedback.
Hi, do I reply to this email with those details?
Hi @Guest, thanks for the idea. I would love to know more about this. Can you describe how your AI tool works today? Are the screen actions captured in a video that you are analyzing? What does the output look like? Thanks!