Sunday, November 1, 2009

Sprint review meeting: It's all about Marketing!

I attend a lot of Sprint Review meetings, and sometimes I leave those meetings a bit sad...

It's not that the team hasn't done a great job! Quite the opposite. The teams I work with perform miracles! But, like most engineers, they sell themselves short.

Sprint Review Meetings are all about Marketing! Your team needs to sell the work done during the Sprint to the audience!

Here are a few tips to effectively market your work:
  • Show the value of what you did: Don't explain how you did it, and don't go into excruciating technical detail about what you implemented... explain the benefits of what you did!
    • Do: "By improving the reticulating splines performance, customers have a much smoother checkout process"
    • Don't: "We improved reticulating splines speed by using a really smart b-tree structure"
  • Make a big fuss about what you did: Don't hide the great work you did in a bullet list with all the details of the Sprint. Pick 3 to 5 key points, and make a slide for each of them.
  • Make a real world demo: Be smart with your seed data. Use real data whenever possible, or some really clever examples to make your demo more effective.
    • Do: "Peter Smith bought a Mega Chair and paid it with a Visa card"
    • Don't: "Customer A bought Product 1 and paid with card X"
  • Talk about the entire Use Case: Even if you don't demo it all, explain how the user got to the point of the demo and what the user is trying to achieve. It will make understanding your demo much easier.
    • Do: "Peter Smith was navigating at our web site to buy a chair. He did a search for chairs, and clicked the 1st result on the list. What you see here is the page present to him at that stage."
    • Don't: "Assume Customer 1 is at this page"
  • Make a clean demo: The demo should be as near to reality as possible. Avoid launching the command line, using batch files, or employing other kinds of odd gizmos during your demo.
  • Don't dwell on what you didn't do: If your team made a commitment this Sprint that it wasn't able to keep, mention it, give a one-sentence justification, and move along. Be prepared to answer any questions that may arise, but don't spend the time you have to present what you did talking about what you didn't do.
    What other tips would you add to this list? How do you turn your Sprint Review meetings into huge successes?

    Tuesday, October 27, 2009

    When is enough good enough?

    We all love perfect products! Those products that are an actual pleasure just to use. And naturally, we all want to make our own products perfect!

    But perfection comes at a price. Generally speaking, the closer you want to get to perfection, the more it will cost you. And the cost increase isn't linear!


    (the smiling faces represent the mood of your "average user")

    This means you have to be careful when investing in perfection. In most scenarios, there's simply not enough time to make everything perfect. Nothing new here, right? It's just the old quantity vs. quality dilemma. You either do 1 perfect thing, or you do 2 "good enough" things.

    If you take this into a software product scope, there are a few guidelines that can help you decide how perfect your feature needs to be:
    1. How many times will the feature be used? Rarely used features can afford to be a bit uncomfortable, occasionally used features can be just good enough, and perfection should be saved for those features that are used all the time!
    2. Who will be using the feature? Features targeted at one particular user can get away with a hack. For features existing users are craving, a quick "good enough" solution may be better than a delayed perfect solution. If you're aiming at first-time users, you should go for perfection, to make sure they stick around for more.
    3. How unique is the feature? Is this something every other product does? Is it always done the same way? Unless this is really core to your product, you probably should stick to good enough and go with the crowd. But if this is something truly unique to your product, polish it up for perfection!
    4. Is the feature demo-able? If this is something you'll want to show in a prospect demo or to a room with 400 people, going for perfection is a good investment.
    5. Is the feature sell-able? Is this one of those features that has the potential to enchant customers? Better yet, is this a feature customers will be talking about to your prospects? If so, by all means make it perfect!
    These are just a few things to consider. I bet there's lots more! What factors do you take into consideration when deciding how much effort you should invest in a feature?

    PS: The same principle can be applied to a bunch of other areas of a software product, like documentation and marketing. Anyone care to make a blog post about those? :-)

    Sunday, September 6, 2009

    The Secret of Agile Speed

    Here's an introductory video on how Agile improves your projects' speed.

    Friday, July 10, 2009

    Reading burndown charts

    Burndown charts are a fairly common tool used in Agile projects to measure the velocity of a team. A burndown chart usually contains two sets of data: the actual remaining effort and the expected remaining effort.

    The thing is, looking at a burndown chart is useless, unless you have a very good understanding of what’s going on with the project.

    Here’s a sample of a burndown chart:


    At first glance, it seems things aren't bad. Although the chart has a sawtooth look to it, every couple of weeks things go back to normal. The drops seem to occur at the end of each iteration (assuming 2-week iterations). But the thing is, the chart doesn't show us why the drops occur!

    One reason might be poor self-management by the team: they only close the stories at the end of the iteration. This is a good scenario, because it means you can quickly coach the team towards more fine-grained stories, giving you better visibility into the project.

    Another reason for these drops might be that, at the end of each iteration, the team reviews the backlog and realizes it will miss the date. With that in mind, they remove the last items from the backlog, and the project gets back on track. This is way more problematic!

    The team is assuming the project is about 1 to 2 weeks late (marked by the dotted red line), so they cut 1 to 2 weeks' worth of work from the backlog. But the reality is that the project is 5 weeks late (marked by the dotted blue line)!


    The consequence of this is that the team will have to cut about half of the backlog to finish the project on time! This can obviously have catastrophic implications for the project....
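
    By the way, the dotted blue line is nothing magical: it's just the team's actual burn rate projected forward. If you can export the burndown data, that projection is a quick one-liner. Here's a rough sketch in awk, assuming a hypothetical burndown.csv export with one "day,remaining" row per working day and no header:

    # burndown.csv is a hypothetical export: one "day,remaining" row per working day, no header
    awk -F, 'NR==1 {start=$2} {last=$2} END { days=NR-1; if (days > 0 && start > last) { rate=(start-last)/days; printf "burned %.1f per day, about %.0f working days left at this rate\n", rate, last/rate } else print "not enough data to project" }' burndown.csv

    Comparing that projection with the planned end date tells you how late the project really is, instead of the "we're only a bit late" feeling the sawtooth drops create.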

    There are several ways to deal with this problem, but the important thing is that action is taken as soon as possible. And to be able to quickly understand what is going on, you cannot rely on the burndown chart alone. You need a very good understanding of what is going on with the project.

    Saturday, May 9, 2009

    Analyzing Movable Type search logs

    The OutSystems blog uses Movable Type. I was curious to see what kind of searches were being made on the blog, so I downloaded the logs from MT's back office and started thinking about how I could analyze them.

    I wanted to generate CSV files, so I could open them in Excel or Numbers. At first I considered Python, because it has good CSV support. But most of the work would consist of parsing the MT log file, so then I considered Perl. The problem with Perl is that I can never remember the syntax, so in the end I decided to just use the Unix command line.

    I started out by counting the unique search sentences in the file:

    grep "Search: query for" logfile.csv | cut -d \' -f 2 | tr '[A-Z]' '[a-z]' | sort | uniq -c | sort -nr | sed 's/ *\([0-9]*\) \(.*\)/\2, \1/' > sentences.csv

    Here's what this does:
    1. grep "Search: query for" logfile.csv - Get all the lines from the log that are searches
    2. cut -d \' -f 2 - Extract the content of the search. This might not work if the content has ' on it, so be advised.
    3. tr '[A-Z]' '[a-z]' - Turn everything to lowercase.
    4. sort - Group the sentences (for uniq -c to work).
    5. uniq -c - Count the unique occurrences of each sentence.
    6. sort -nr - Sort by numbers, in reverse order.
    7. sed 's/ *\([0-9]*\) \(.*\)/\2, \1/' - Transform the result into a CSV file.
    8. > sentences.csv - Save to sentences.csv.
    Turns out this wasn't very useful, because there are a lot of different sentences. One of them kind of stood out, but it only accounted for about 5% of the searches. So I added a bit of awk to do the same for the words instead of the sentences:

    grep "Search: query for" logfile.csv | cut -d \' -f 2 | tr '[A-Z]' '[a-z]' | awk '{for (i=1;i<=NF;i++) print $i}' | sort | uniq -c | sort -nr | sed 's/ *\([0-9]*\) \(.*\)/\2, \1/' > words.csv

    The awk script splits the sentences into words. This yields a more interesting result: 26% of the searches include the word "Agile"! Note that this also counts words like "and" and "the", but it's easy enough to remove them.
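
    If you want to drop those filler words, one simple way (a rough sketch; stopwords.txt is a hypothetical hand-made list you'd create yourself, one word per line) is to add a grep filter right after the awk split:

    # stopwords.txt is a hypothetical hand-made list: one word per line (and, the, a, ...)
    grep "Search: query for" logfile.csv | cut -d \' -f 2 | tr '[A-Z]' '[a-z]' | awk '{for (i=1;i<=NF;i++) print $i}' | grep -v -x -F -f stopwords.txt | sort | uniq -c | sort -nr | sed 's/ *\([0-9]*\) \(.*\)/\2, \1/' > words.csv

    The -v -x -F combination makes grep discard any line that exactly matches one of the fixed strings in stopwords.txt, so only the exact filler words are removed.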

    There's some more fun stuff you can do with this! Use Wordle to create a word cloud:
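
    Wordle sizes each word by how many times it appears in the text you paste in, so one easy way to feed it (again, just a sketch building on the pipeline above; wordle_input.txt is a name I made up) is to dump the raw lowercase words to a file and paste its contents into Wordle:

    # same pipeline as above, minus the counting, so Wordle can do the weighting itself
    grep "Search: query for" logfile.csv | cut -d \' -f 2 | tr '[A-Z]' '[a-z]' | awk '{for (i=1;i<=NF;i++) print $i}' > wordle_input.txt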


    Or you can check where the searches are coming from. If you have an IP geolocation DB, you can try this:

    grep "Search: query for" $1 | cut -d , -f 2 | sort | uniq | awk '{print "SELECT country_name FROM ip_group_country where ip_start <= INET_ATON(\""$1"\") order by ip_start desc limit 1;"}' | mysql --skip-column-names -B -uGeoDBUser -pGeoDBPwd GeoDB | sort | uniq -c | sort -nr | sed 's/ *\([0-9]*\) \(.*\)/\2, \1/' > geo.csv

    You can then open the file and make a nice chart! Have fun!

    Wednesday, March 4, 2009

    Great employee care!

    I had a baby boy a few days ago. Because of that, I'm on leave from work. And, to my surprise, today I got a gift delivered to my house.

    It was a gift from the company I work for, OutSystems! There was a plant for my wife, a cigar for me, and a bunch of stuff for the baby, including some diapers, a blanket, and some baby shoes.

    My company treats me pretty well on a regular basis, but for some reason this attention really struck a chord. It was a really thoughtful gesture, and I truly appreciate the gift!

    This is even more meaningful if you consider the tough times we're going through. To me, this sends a clear message that OutSystems is devoted to great employee care, even in the middle of the current crisis.

    Thank you OutSystems!

    Saturday, February 14, 2009

    Sharing a mouse and a keyboard between computers

    I have a Mac at home. Considering I spend most of my computer time @ work, the poor computer was feeling kind of lonely.

    Taking the Mac to work allows me to use a bunch of stuff that I really like for task organization, script development, and so on. But having to swap between two keyboards and two mice is a pain.

    Along comes Synergy. It allows you to share a mouse and keyboard across computers, and it works on a bunch of OSs, so I can have the server running on my windowz box, the client on the Mac, and share the mouse and keyboard between them. It even allows you to copy and paste text between the two boxes!

    Synergy works pretty well, and I'm using it every working day, but there are a few quirks that are kind of annoying. Here they are, so you can set your expectations:
    1. When the host machine is working hard (CPU or HDD), it stops working. This is a bit of a pain, because most of the time I want to switch to the Mac precisely when windowz is thrashing.
    2. The keyboard layout for the Mac is a bit strange. You get Alt for Command, and the Windows key for Option. This means you use Alt-C to copy on the Mac and Ctrl-V to paste on Windows. It gets a bit confusing.
    3. It's not that tolerant of bandwidth ups and downs. Since my Wi-Fi network at work sucks, this is a problem sometimes.
    Still, it is a pretty cool piece of software and well worth a try. And the price is right!