Interrobang.ai, my company focused on automating portions of project management, has gone bang pretty fast, in a bad way.
It's a lot to process, but I think the best way to explain it is through a Mark Twain quote that I love: "A man who carries a cat by the tail learns a lesson he can learn in no other way."
In short, I've always been aware of how easy it is to bias user research, but it wasn't until I really got into it that I learned my lesson by picking up a metaphorical cat by the tail. Essentially, what happened was that I got people very excited about my project management ideas, and then I falsely interpreted that excitement as validation of the ideas.
I had noticed that lots of technical program managers (TPMs) ask the same questions over and over, for example, "Is this on track?" "If not, what are you doing about it?" and so on. I had invented a clever way to automate some of these questions in a way that would be simpler for engineers to answer.
This was what got people excited: asking engineers for updates is a pain, and it's easy to get program managers riled up when you prompt them about it. It's a button that's easy to push. I've since learned to recognize that excitement as a false positive.
When it came down to actually testing my ideas, it took a fair bit of work to break a project down before those automated questions could be asked in the first place. For teams that didn't have project managers to break down projects (which was where I thought I'd had the most traction with customers), this was a non-starter.
So the validation I thought I had was based on a false positive: people who said they would or could use the solution, rather than people who actually did.
Learning this, of course, felt terrible, but the good news was that I figured it out relatively quickly and cheaply. Slow and expensive would have been far worse. I've also learned how to identify false positives, and how to use testing to move past them. (The simplest rule of thumb is that user research is only valid for capturing past, historical behavior. Any hypothetical new or future behavior can only be validated by "skin in the game" testing.) The bad news is, well, that it felt terrible, and I didn't have the traction I thought I did.
The hard part now is deciding what to pivot to, or whether I even should. I know there's a market for project management tools, but it's an unbelievably cluttered space, and almost everyone in it had the luxury of developing their tool internally before rolling it out. That includes Jira, Asana, Trello, ClickUp, and others. I didn't want to build "yet another PM tool"; I wanted to focus on a lot of what I'd learned and to codify the process of using those tools. Sometimes different ways of doing things (like breaking down a project to take advantage of automated questions) work better as part of a consulting practice than as a standalone product.
Given the feedback I've gotten so far, I'm not sure I had the edge I thought I had, and I'm no longer sure Interrobang is a good idea as a business. (Distinguishing between good ideas and good businesses is often one of the toughest things to do.)
So for the time being, I'm putting Interrobang on hold. In the meantime, the world is my oyster as to what comes next. I'm still interested in technical project management, consulting and helping teams have their appeal, and there's an ice box of Awkward Engineer projects that I've been waiting to crack open.
Thank you to everyone for your support!
aka THE Awkward Engineer