Sunday, December 7, 2008

Measuring the Value of Testing

One of the best and most obvious ways to excel in any company is to show how your work is making the company money, or providing value that directly leads to making the company money. The more directly you can tie your work to how much money is being made, the better position you're in. For instance, if you can easily show that the feature you developed, or designed and brought to market, made the company $X over a time frame of Y years, you're doing well. If you can optimize that function so that X is large and Y is small, you're on the fast track to upper management.

Showing this sort of result in testing is a much harder proposition. It's easy to say that you tested a feature or product that made $X over a time frame of Y years, but let's be honest here: being the product manager or lead developer and making the same statement definitely carries more weight on a resume. The person generating the idea and the person originally implementing it will always be seen as closer to the end result and the positive cash flow than the person ensuring the value of the idea, and ensuring that the mass market can actually use it as expected. I'm not saying that this is "the way things should be", I'm just stating it's the way things are.

That being said, here's why I think demonstrating the value you've directly provided to a product is harder for testers: you can't show the diff between what the product actually made and what it would have made had the bugs you found been released to the public. That is an alternate reality we don't have access to; it's an A/B test that can't be run.
If it were possible to conclusively show that the company made $Z more because the bugs you found never shipped, I think testing would be taken more seriously across the industry.

Friday, December 5, 2008

Testing in Layers

My old boss and mentor was chatting with me over Facebook recently about a concept he termed "testing in layers". It's the basic idea of slowly testing a new release with people who are close to the code, then providing the release to people a little further away (friends/family/investors), then a little further (active members on your forums), eventually moving out toward your normal everyday users. The idea struck me because it provides a simple mental model for something I think many in the testing community are already doing at a basic level, while leaving plenty of room for complexity if need be. It's nice to put a name to an existing process, so that it can be better codified and thought through.

In my world, I use this idea in the following context:
  • Test internally with engineering (QA)
  • Test internally with non-engineering (Marketing, etc.)
  • Test by releasing to trusted users on our forums
  • Test by sending the release to particular people who email into Support, whose problems may be fixed by this release
  • Test by releasing to everyone on our forums
  • Test by releasing to a small number of people who download through the website
One of the issues with this idea is that a feedback loop needs to be in place at each layer. If we release a build to the forums and don't listen to the complaints and issues that layer is having, then the testing is pointless. However, the further out you get from the center, the harder it is to get solid feedback. When we release to a small number of people through the website, they probably don't know that they're getting a brand-spanking-new release. In that case, how do we get their feedback? As of right now, we simply watch to see if they uninstall, and if the stars align, they might even leave us feedback. It's not perfect and can use improvement, but so far it's a process that is working adequately to help us determine how the release is doing.
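
To make the layering above a little more concrete, here's a minimal sketch (in Python) of how the layers could be encoded as a simple rollout gate. The layer names, the hashing trick for sampling website downloads, and the percentage are my own illustration of the idea, not a description of any tool we actually run. Note that it only covers who sees a build; the feedback half of the loop is the harder part.

    import hashlib

    # Layers, ordered from closest to the code to furthest away.
    LAYERS = [
        "engineering",       # QA and the rest of engineering
        "internal",          # marketing and other non-engineering staff
        "trusted_forum",     # hand-picked forum members
        "support_contacts",  # people whose support issues this build may fix
        "forum",             # everyone on the forums
        "website_sample",    # a small slice of normal website downloads
    ]

    def build_available(user_layer, current_stage, user_id=None, sample_pct=5):
        # A build released at `current_stage` is visible to that layer
        # and every layer closer to the code.
        if user_layer not in LAYERS or current_stage not in LAYERS:
            return False
        if LAYERS.index(user_layer) > LAYERS.index(current_stage):
            return False
        # The outermost layer only gets a small, deterministic sample.
        if user_layer == "website_sample":
            digest = hashlib.md5(str(user_id).encode()).hexdigest()
            return int(digest, 16) % 100 < sample_pct
        return True

    # While the rollout stage is "forum", a trusted forum member gets the
    # build, but a random website download does not.
    print(build_available("trusted_forum", "forum"))         # True
    print(build_available("website_sample", "forum", 1234))  # False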

The idea of quality feedback loops for your product will be saved for another blog post. Let's just say that Twitter has been proving quite useful in getting product feedback lately.

Thursday, December 4, 2008

When has Outlook "Started"?

I have recently had the opportunity to work with a small team to define and run some basic performance tests for Xobni. If you've done serious performance testing before, I probably don't need to tell you what a peculiar beast it can be. The fun part about this project is that we are measuring areas where our customers are having pain points, and building tools to automate the running of those measurements. On the other side of the coin, creating a consistent and controlled environment where performance measurements can be taken without fear that something external is affecting them keeps me up at night. No, not literally, but environmental control is one of our biggest problems.

One particular measurement we've been struggling with for some time is simply known as "Outlook Startup Time". How do you know when Outlook has fully, really, finally finished starting? Most importantly, when do our users think that Outlook has really "started"? This is an important question for us: if we're going to improve Outlook startup time with Xobni installed, we have to know what that means. Well, here are a few ideas for measuring Outlook startup that we've implemented in a tool of ours:
  • When the "Reading Pane" is visible and has text
  • When the Xobni sidebar appears AND the Reading Pane is visible
  • When the Application.Startup event fires in Outlook
  • When Outlook has finished syncing with Exchange
  • When you are able to move to the next mail and have it load within a certain period of time
  • When the CPU usage drops back to a level on par with usage before you started Outlook
As you can see, we're measuring a lot and trying to see what sticks. What are your thoughts? When is Outlook usable, by your definition?
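
As an example of that last CPU-based check, here's a rough sketch of how it could be measured from outside the process. This is Python using the psutil library; the thresholds, timings, and launch command are placeholder assumptions for illustration, not the actual tool we use.

    import subprocess
    import time

    import psutil  # third-party: pip install psutil

    def wait_for_cpu_baseline(baseline_pct, tolerance=5.0,
                              quiet_seconds=10, timeout=300):
        # Block until total CPU usage stays within `tolerance` percentage
        # points of `baseline_pct` for `quiet_seconds` in a row.
        # Returns the elapsed time in seconds, or None on timeout.
        start = time.time()
        quiet_since = None
        while time.time() - start < timeout:
            usage = psutil.cpu_percent(interval=1)  # one-second sample
            if usage <= baseline_pct + tolerance:
                quiet_since = quiet_since or time.time()
                if time.time() - quiet_since >= quiet_seconds:
                    return time.time() - start
            else:
                quiet_since = None
        return None

    # Sample the machine's idle CPU, launch Outlook, then time how long
    # it takes for CPU usage to settle back down to that level.
    baseline = psutil.cpu_percent(interval=5)
    subprocess.Popen(["OUTLOOK.EXE"])  # in practice you'd use the full path
    startup_time = wait_for_cpu_baseline(baseline)
    print("Outlook 'started' after", startup_time, "seconds")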

Thursday, May 29, 2008

A Different Kind of QA: Calling all Engineers

This is a repost of a blog post I made on the Xobni blog. You can read the original post here.


It’s common for people to ask why a good engineer like myself would want to work in QA, especially when you have to fight the stigmas of QA, namely:

1) You are in QA because you are not good enough for development

2) You are in QA as a stepping stone for development

3) You are in QA because you don’t like coding

My response to those statements: pish-posh. While they may apply to some people in the field, they certainly don’t apply to the people serious about QA. Good QA engineers solve quality problems with an algorithmic intensity that rivals traditional programmers. They are true hackers in the older sense of the word - they are here to find and exploit the problems in the system in any way possible.

Every problem has its boundaries. For most developers, the boundaries for implementing solutions are usually confined to one language, stack, or technology. The boundaries for problem solving in QA are generally much wider, simply because our solutions don’t have to be productized, exposed to the public, and aren’t necessarily even in the same language or stack.

This allows a much wider range of creative freedom when solving problems. Learning new languages and technologies becomes essential for your work. Having a large arsenal of tools to attack a problem becomes a necessary part of the job. This provides you with even more of a reason to learn about the latest and greatest in tech, which is something that appeals to all engineers alike.

At Xobni we approach QA differently than most. The people we look for are not here because they are not good enough for development. They are not here because they don’t like coding. The QA people here are expected to be at the top of their game. They are expected to build and create software that can topple the Jenga-like building blocks of our product. They are expected to be creative people who like to learn, explore, and exploit software.

That being said, Xobni is looking for a QA engineer! Check out the job post, and send resumes to ryan dot gerard at xobni.com if you think you can rock our world.

Tuesday, March 11, 2008

Productivity and Flow

I've been thinking a lot recently about that state of mind where you lose track of time, focus intensely, and generally are very productive. This state of mind isn't only associated with working; it can be found when exercising, playing games, and during other focus-intensive activities, but I generally associate it with work, mostly because my best work is done in this state of mind. I find that if I can lose myself for a while and dissociate from reality while working, I generally come out of that state with an amazing amount of work done.

I was thinking recently that if I could induce this state of mind more frequently, I could become a more productive person. After a short amount of googling, I discovered that there is an entire area of research in psychology devoted to this subject; it's known as "flow". One of the main researchers in this field is a Hungarian psychology professor named Mihaly Csikszentmihalyi. He's published a few books on the subject, and the one I picked up is called "Flow: The psychology of optimal experience".

The book itself was a little too self-helpy for my tastes (if you look carefully, you'll see that the subtitle of the book above is "Steps toward enhancing the quality of life"), however I found tidbits inside that could have been taken out of any of the software management books I've read.

After skimming through the bits on how and why you desire happiness, I found the core of the book: the elements of flow experiences.


1. Engaging in a Challenging Activity

He explains that the activity you're engaged in has to be at the edge between skill and anxiety. Even if your activity is complex, if you're too familiar with it, it won't be considered "challenging" to your psyche. You have to find something that is within your reach to learn or finish, but isn't easy.

2. Merging of Action and Awareness

Your attention is completely devoted to the activity, such that you have no awareness of the outside world. It is intense concentration, but it seems effortless when you're deeply involved.

3. Clear Goals and Feedback

This is fairly self-explanatory, and it is also where I started seeing parallels in software development and management. On the development side, having small coding goals that are constantly achieved and iterated on is how I think many productive people program. On the management side, providing clear feedback and goals to your employees is a staple of good management.

4. Concentration on the task at hand

This is probably obvious, however what I found interesting was that he believes only a select range of information can be allowed into your awareness when in this state. Irrelevant information in your mental activity can break your concentration, and hence your flow.

Relating this back to software environments, he goes on to state that quiet environments are essential to keeping your concentration. Much has been written already about how loud environments are productivity killers, and this just provides more evidence for that.

5. The Paradox of Control


He says that the flow experience is strongly associated with a sense of control. This resonates strongly with programming in my experience. One of the psychological benefits of programming (in my non-expert opinion) is the sense of mastery and control you gain over the system you're programming against. "Hacking" (in the Paul Graham sense, not the Kevin Mitnick sense) is merely another way of asserting your control and power over the system, by finding a non-obvious or faster solution to a specific problem. It's a very primal feeling that I think many, if not most of us, desire.

Mihaly then writes that the "paradox" of control "...is that it is not possible to feel a sense of control unless one is willing to give up the safety of protective routines". In essence, your sense of control comes by putting yourself into situations where you actually have less control, since the unknowns are much greater than in situations that you've experienced before. As he writes, "Only when a doubtful outcome is at stake...can a person really know whether they are in control".

6. The Loss of Self-Consciousness

Losing your sense of self-consciousness is a phenomenon typically talked about in association with meditation or zen-like activities. This loss is typically accompanied by "a feeling of union with the environment". Projected onto programmers, the environment you feel a union with is typically whatever framework, system, or specific program you're working in.

Mihaly explains that what is temporarily lost is not the sense of self, but the concept of the self. High-performing violinists are very aware of their fingers, just as runners are aware of their breathing. They haven't lost their sense of self, but the boundaries for how they define the self have temporarily vanished. This can be a very liberating experience, described as "...the feeling that the boundaries of our own being have been pushed forward".

7. The Transformation of Time

It is normal to emerge from a flow experience and see that hours have passed without your awareness. What you're measuring when in this activity is not time, but states or milestones. When programming intensely, it's not uncommon to think of your progress not in terms of minutes and hours, but in terms of functions written, functionality working, and pieces integrated. Your world turns into a state-driven world, and not a time-driven one.


For the skeptical types (like me), I want to say that these elements are conclusions drawn from many studies of people experiencing flow in many different types of activities. While this doesn't mean that the conclusions are true, it does have more credibility than just some quack spouting off what he thinks brings about flow experiences.

Hopefully this has provided you with some thought-food to chew on regarding your own productivity. I think the main take-aways from this for me are that to really engage deeply in an activity, one needs:
  • A challenging task
  • A quiet environment
  • Clear Feedback (usually in the form of finished functions and functionality in what I'm writing)
  • A clear mind
  • Enough time set aside to engage deeply with the activity
I said earlier that I found parallels between this book and other software books I've read, and these take-aways prove it. These bullet points could be taken directly out of "Peopleware" or "Managing Humans", or any other book that deals with software productivity. It's always interesting to find parallels between different disciplines, and I find the psychology of programming particularly interesting.

Wednesday, January 23, 2008

The Great TODO List

Organizing and planning your work is tough. It's not hard to list everything you need to do, but prioritizing that list can be an art form. For instance, do you focus on exploring a problem with the server (which is high-priority to your co-worker), editing a design document (which your boss wants done soon), or finding the root of that newest bug (which is really what you're getting paid for)? There are definitely subtle tones of politics in these decisions, but I try to keep those matters out of my head when prioritizing.

I was discussing day-to-day planning tactics with some friends yesterday, and we found we all had different methods. I thought I'd share mine with you. I call it the Great TODO List. It's quite simple really. Anyone can start using this method immediately. The low-techness of it is astounding.

Step 1: Open Notepad
Step 2: Write down everything you need to do
Step 3: Put everything you need or want to do today at the top of the list

Amazing, isn't it?

Every day or so I go through the list, roughly re-prioritize, and bubble up anything that should be at the top but that I may have forgotten about. As I go through the day, when I start to feel like I should be working on something else, I just consult the list and pop from the top. Yes, the list is a stack.
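
For the programmers in the audience, the workflow really is just a stack over a text file. Here's a toy sketch of it in Python; the file name and helper functions are made up for illustration, since the real "implementation" is me in Notepad.

    TODO_FILE = "todo.txt"  # the Notepad file; the name is made up

    def load():
        # Read the list top to bottom, skipping blank lines.
        with open(TODO_FILE) as f:
            return [line.rstrip("\n") for line in f if line.strip()]

    def save(items):
        with open(TODO_FILE, "w") as f:
            f.write("\n".join(items) + "\n")

    def push(task):
        # Something new and urgent goes on top.
        save([task] + load())

    def pop():
        # "What should I work on next?" -- take the top item.
        items = load()
        if not items:
            return None
        save(items[1:])
        return items[0]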

The one downside to this method is that the list continually grows. I have stuff on my list from a few weeks back that I should still do at some point, but the likelihood of me doing that stuff is getting smaller and smaller by the day. The list needs love and pruning.