Saturday, November 28, 2009

Small Barriers == No Socialization

I was again reminded today that very small barriers can kill a conversation, or cause a contributor to stop contributing. I wanted to post a follow-up comment to David Chadwick's exploratory testing blog posting. I was on a different computer from the one I originally used to post my first comments.

Unfortunately, the IBM developerworks site is unfamiliar enough that I don't remember my password. I remember that it had a more restrictive password policy than other sites, so I was unable to apply my common password rules to their site. That made the site "unique" in a negative sense, something I had to remember outside my normal patterns of memorization.

The key point: I had a few minutes to contribute something to a conversation. I was not willing to spend more than a minute or two, including the time to log in, post the comment, and verify the comment was posted. I abandoned the comment because:

  • Restrictive password rules were outside my typical rule set
  • Time was limited, and I was unwilling to spend it trying to log in again
  • Prior experience with the site made me distrust its use of my time

The relationship between a contributor and the forum to which they are contributing seems to be very fragile, at least for me. The newer I am to a forum, the more fragile the relationship. The less value I perceive from the forum, the more fragile the relationship. Even seemingly insignificant hurdles may be enough to stop a contribution and disengage a contributor.

Thursday, November 26, 2009

Do Not Discard My Data

I just wasted 40 minutes trying to post a comment to an IBM developerworks blog posting from David Chadwick.

That wasted 40 minutes was a flawed attempt to describe the simple changes I think could be made to David's list of things which he believes exploratory testing is "not". I believe that with relatively simple wording changes, I could modify each of his "not" descriptions to instead be descriptions of high value, useful exploratory tests. The wording changes are so small that I believe they hint that David may not yet understand exploratory testing and how to apply it. However, this posting is not about exploratory testing, it is about a site that lost my data, twice.

My first comment was lost after 20 minutes of writing, thinking, and editing. The comment was lost because I clicked the "Add Comment" link, entered my comment, then pressed the "submit" button. The page which was returned politely informed me that my comment was rejected, and listed several possible reasons for the rejection. Unfortunately, IT DISCARDED MY DATA.

Don't discard my data! It frustrates me as a user and makes me unwilling to return to the site. If I must be logged in to submit a comment, force the login before accepting my input.

I registered on developerworks, logged in, and clicked the "add comment" link again. I added a short, dummy comment to confirm that I was now able to add comments. I was.

I wrote a new response (I assume it was a little better than the first comment, since second drafts are commonly better than first drafts). After about 20 minutes of working on that response, I clicked the "submit" link. My comment was rejected again, with the same list of possible reasons for the rejection. Unfortunately, IT DISCARDED MY DATA AGAIN.

Don't discard my data! It makes me feel stupid, and then I need to remind myself that I'm not stupid, the software which should be working for me is instead making me do the work.

I assume the second failure was due either to my inserting URLs into the text, or to the length of the text I was trying to post. I don't know which, and I'm frustrated enough with the developerworks site not to care.

I gave up on trying to post a useful comment. I left a short note that my comments had been rejected twice, and if the author wanted my comments, he would need to send me e-mail.

Thursday, November 12, 2009

Systematic Source Formatting

When my team switched to Extreme Programming in March 2003 (XP by 3/03), we decided to try all the practices and all the recommendations. There were plenty of bumps and bruises as we learned what did and didn't work for us. We learned that the original XP descriptions were too light on testing for our business needs, and we learned that integration tests worked better for us than pure unit tests with mock objects. Those can be topics for another time.

One of the surprisingly effective results of that adoption of all the practices and all the recommendations was using automated source code formatting to ensure all our code looked the same. We had a diverse team of people developing the code, some in the U.S., some in Germany. That diverse team of people had different opinions and attitudes about the "one true way" to format source code. Those different attitudes and opinions led to annoying little changes from one file version to another as the files changed hands from one person to another.

Since XP espouses "no code ownership" and allows anyone to be anywhere in the code, those formatting differences tended to worsen as people moved through the source tree.

We took a "top down edict" approach to the formatting problem. I decided (as the manager) that we would use a program to format all our Java source code as part of each compile. The concept was that developers would no longer need to think about formatting their code; a program would do it for them. We inserted the open source version of the "jalopy" source code formatter into our standard build process, then added a step to systematically format the code once a day with the same process, in case someone forgot to format before committing changes to the master.
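The daily "format everything" sweep is simple enough to sketch. This is a minimal illustration, not our actual build code: it assumes a command-line formatter is available (the `jalopy` invocation shown here is hypothetical; the real Jalopy was usually run as an Ant task inside the build).

```python
#!/usr/bin/env python3
"""Sketch of a daily formatting sweep over a Java source tree.

Assumptions (hypothetical, for illustration): a formatter can be invoked
as a command taking one file path, e.g. `jalopy Foo.java`.
"""
import pathlib
import subprocess


def find_java_files(root):
    """Return every .java file under root, sorted for reproducible runs."""
    return sorted(pathlib.Path(root).rglob("*.java"))


def format_tree(root, formatter_cmd=("jalopy",)):
    """Run the formatter over each Java file in the tree.

    formatter_cmd is the formatter invocation prefix; the file path is
    appended as the last argument. Returns the list of files touched.
    """
    files = find_java_files(root)
    for f in files:
        # check=True makes the sweep fail loudly if the formatter rejects a file.
        subprocess.run([*formatter_cmd, str(f)], check=True)
    return files
```

In our setup the equivalent of `format_tree` ran both as a build step and as a once-a-day scheduled job against the source master, which is what kept the tree converged on a single format.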

The initial adoption of the change caused some angst, since the chosen format (Sun Java code formatting guidelines) disagreed with the "one true way" which some of the team expected. We pushed our way through that initial hurdle and found over the course of years that automated source formatting freed us from a number of problems.

Positive results of our switch to automated source code formatting:

  • Code looks the same no matter who wrote it
  • Changes in the source master are "real changes", not white space shuffling
  • Developers don't need to think about source code format; they can insert code however they want and the formatter will fix it to meet the standard

Negative results of our switch to automated source code formatting:

  • A "diff wall" was created at the point where we first ran the automated source code formatting tool. The automated formatting created a single massive change to the source master to convert from the old ad-hoc format to the new consistent format. I worried at the time that the "diff wall" was going to be a real problem, but it never appeared as a problem in our work
  • Command and control style management is frequently a bad pattern and should be reserved for rare occasions (our major process change from waterfall to Extreme Programming was done as a manager dictate from above, and this was "hidden" in that process change)

There are newer tools to support automatic source code formatting (astyle seems to be popular) and I hope to persuade my new organization to adopt automated source code formatting. I'm not in a position to be the dictator in the new organization, so changes like that are more difficult to create.

Sunday, November 8, 2009

Deliberate Practice - Mary Poppendieck Talk

As a manager at PTC, I need to write a performance appraisal for each of my direct reports. I will also receive a performance appraisal from my manager to help me identify strengths and weaknesses and set goals for the coming year. While preparing my own self-evaluation, I remembered reading about the concept of "deliberate practice". I'd already been trying to apply my understanding of the idea, but decided to listen to an original source before including more about it in my self appraisal.

Mary Poppendieck's Agile 2009 deliberate practice talk describes deliberate practice as practice which is intentionally focused on improving performance. I want to improve my performance. Mary highlights four key components that are required for a person to be using deliberate practice:

  • Mentor - a high skills expert to review, critique, and highlight flaws

  • Challenge - tasks that require greater skill than we currently possess

  • Feedback - review and analysis of results used to improve future attempts

  • Dedication - hard work, time and energy applied diligently

I don't have a mentor for the things I'm trying to improve! At least, I haven't identified someone as my "coach" and had them agree to review, critique, and highlight flaws in my performance. Flaw number 1 (and the most glaring flaw) in my recent improvement attempts.

Many of my tasks and assignments are the same assignments I've had before, coordinating, planning, discussing, and interacting with others. However, I always find those tasks challenging because I don't feel like a naturally social person. Negotiating, discussing, persuading, and debating do not come naturally to me. There is room to improve in the "challenge" area, although I don't see it mattering as much as the absence of a coach in my improvement efforts.

Most of my activities have feedback built into the activity in one form or another. Meetings which start on time, stay on task, complete their objectives, and end on time leave me feeling refreshed, invigorated, and useful. Meetings which start late, wander aimlessly, don't have objectives, or extend beyond their scheduled end leave me frustrated and edgy. That's a form of feedback. However, there are other feedback forms (like team retrospectives) which I've not been using faithfully lately, and need to start using again.

Dedication may actually need some "negative attention", since lately I've been spending too many hours at work and not enough hours with my wife and children. Mary's talk notes that expert performers in other fields (like music) have discovered that they cannot apply more than 3 hours of deliberate practice at a time because it is too tiring. They stop, change tasks, take naps, or otherwise refresh themselves rather than continuing, and risking developing bad habits by practicing poorly.

With the identified gaps in my efforts to improve, now it is time to choose a mentor and start hearing the performance critiques, then acting on them. Finding a mentor seems like a difficult task for a manager. I want a mentor who is

  • Regularly and naturally exposed to my performances, without requiring that I present them a summary of what I did. Swimming coaches do not ask the swimmer to describe the most recent swim, they watch the swimmer and then tell the swimmer directly and openly what could have been improved in that swim. My mentor needs to be someone who "watches my performances" on a regular basis as part of their normal work

  • Able to spend time and energy critiquing my performances with focus on improving them

  • Credible as an expert. My request for coaching is an act of trust and that extension of trust requires that I believe in the skills of the person providing the coaching. I may disagree with their perspectives, challenge their ideas, and still want their coaching

  • Interested in my success. Without interest in my success, I doubt the coach can be trusted to provide excellent feedback

I'm sure there are other things I need in a mentor as well, but that list already worries me. Those who are regularly involved in my work tend to be my direct reports, my peers, and my manager. My direct reports aren't very interested in coaching their manager (other than possibly to smile with him about his many failings). My manager is already a mentor by being my manager. My peers and I frequently disagree on methods and techniques and so I'm not sure peers are the best source of mentors either.

All this musing might also be less useful if instead of a mentor, I need to become a "buccaneer scholar" as suggested by James Bach. James suggests that I should take responsibility for my own education and for my own learning. That may make seeking a single perfect mentor a waste of effort, rather I could accept that there are mentors all around me and those mentors can provide useful information at times, and information to be ignored at other times.

Another alternative is to consider Bob Sutton's questioning of the value of annual performance appraisals. So many ideas to consider, so many things to learn (the act of writing this has already taken me on a different path than I expected when I began...).

Thursday, November 5, 2009

Was Our Switch to Git a Mistake?

My team is part of a larger, multi-team, multi-site organization in our company. The corporate choice for software configuration management is ClearCase. Unfortunately, my team is "remote", and "small". ClearCase does not handle remote teams well. I could find more polite ways to say it, but that is the simple result of our experiment. ClearCase was costing us far too much time due to its poor performance over a wide area network.

We solved the ClearCase problem by creating a bidirectional bridge between ClearCase and Subversion. My team's interface to the source master became Subversion. We saw great performance improvements, and were seeing the ClearCase updates within a few minutes of their arrival on the ClearCase master. The bridge was conceptually very simple, and it worked very, very well for our needs. Life was good.
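The conceptual shape of that bridge was a polling loop: fetch new changes from one side, replay them on the other, and remember what has already been synced so the bridge doesn't echo its own commits back. Here is a minimal sketch of that shape with the actual `cleartool` and `svn` commands abstracted away as callables; all the names and the change-record format are hypothetical stand-ins, not our real bridge code.

```python
def sync_once(fetch_changes, apply_changes, seen):
    """One polling pass of a one-way bridge.

    fetch_changes: callable returning a list of change records (dicts with
        at least an "id" key) from the source system.
    apply_changes: callable that replays one change record on the target.
    seen: a set of change ids already replayed, shared across passes so the
        bridge skips its own previously synced work.

    Returns the ids applied in this pass.
    """
    applied = []
    for change in fetch_changes():
        if change["id"] in seen:
            continue  # already bridged on an earlier pass
        apply_changes(change)
        seen.add(change["id"])
        applied.append(change["id"])
    return applied
```

A bidirectional bridge is two of these loops, one in each direction, sharing the "seen" bookkeeping. Running a pass every few minutes is what gave us ClearCase updates in Subversion shortly after they landed on the ClearCase master.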

We were then faced with a new, even more challenging problem. The new development work needed to be spread even more widely than the previous development work, with a larger number of teams involved and more dispersed geography. The new work would create multiple products, each with their own release cycle and their own development lifetime.

At the time we were making this transition, I'd been experimenting with the git version control system. Git is version control software created by Linus Torvalds when he needed to switch from BitKeeper for Linux kernel development. It runs very, very well on Linux. It is significantly faster than Subversion, which (in our environment) is significantly faster than ClearCase. Performance looked like a big winner.

Another challenge of the new environment was branch related. The new teams thought their development model would likely be "branch intensive". My prior experiences with branches have been with CVS and Perforce, where branches are globally visible and merging things between branches is a hassle. I hate branches. However, considering that the new world would be "branch intensive" and Subversion is generally not perceived favorably for branch management, we didn't want to use Subversion in a "branchy" environment.

With those two needs, distributed teams and branch intensive environment, we skipped Subversion and went looking. My recent git experience (and recent Mercurial experience) biased me in favor of a distributed version control system. A key opinion leader in the company had also been using git in a subteam of a very large project, pushing their results back to ClearCase. They reported positive results. My experience had also been positive while I was experimenting with taking a work project off on a "tangent". Git worked well for me, sitting on my underpowered Linux box doing my personal "skunk works" project.

We chose git as the team source control system.

Unfortunately, I had failed to detect my own biases, and the biases of the other early adopters of git. Those biases were very different from the biases of my co-workers.

I'm a command line fan. I'm old enough that my first high school experience programming computers was on the newly installed terminals connected to the school district mainframe (thanks Davis School District and Layton High for spending the money, the time, and the pain to install those machines!). From there I moved to a university that required me to submit programs on punch cards (which makes me sound old). Before I left the university, they had upgraded to dumb terminals communicating with a DEC minicomputer.

As a command line fan, I found the git "user interface" perfectly comfortable and very similar to CVS, Subversion, and Perforce. There were a few surprises while I tried to understand distributed version control, but those surprises were related to version control concepts, not the specifics of git.

Unfortunately, many in my team and in other teams are not command line fans. They are accustomed to productivity accelerators like graphical user interfaces, integrated development environments, and mouse clicking to perform work much faster. The transition to git has been painful for them. In addition to my transition experience (centralized vs. distributed, new commands, new concepts), they've also had to deal with transitions from robust GUI tools (TortoiseSVN, Perforce Windows client, etc.) to weak and brittle GUI tools (GitSVN, gitk, git gui, etc.).

The challenge has been made worse by our decision as a management team to isolate teams on branches. Two of the managers in the team come from a large-scale development organization (5-10x larger than our current organization) and they are accustomed to requiring branches as a way to isolate one team from the potentially breaking changes made by another team. The price of that branch isolation is that we are now required to perform more frequent merges of work, with the resulting complexity and frustrations which come from merging with conflicts. It gets worse when the files to be merged come from the Visual Studio IDE, and the meaning of their contents is not always clear.

I think the branch configuration decision has done more damage than the choice of git, but that is probably biased (again) by my command line centric mindset. Unfortunately, we're far enough into the project that we aren't willing to switch SCM systems. We'll remain with git for at least the duration of this project, glad to have a source master, glad to have it connected to our continuous integration servers, and glad to not have the awful performance of remote ClearCase.

In all fairness to git, I still remember the growing pains when we switched from CVS to Perforce. I whined mightily at paying hundreds of dollars per developer for our corporate standard SCM system. Then I whined mightily at the tool changes and use model changes forced upon us by Perforce's way of thinking. After 6 months or a year, I discovered that I had changed my way of thinking, and was now very comfortable using Perforce, getting value from its way of branching, and being very grateful that it was so fast.

Maybe 6-12 months from now I'll say the same things about git. Maybe it is a part of "climbing the learning curve", and unfair to judge our experience this early. Or maybe not...

I still don't know what we should have chosen instead of git, since it is not clear to me that there were any better alternatives for my team at that time. The company was not willing to purchase another SCM system, since they were already paying for ClearCase. That excluded all the purchased SCM systems (Perforce, Microsoft Team System, Accurev, BitKeeper, etc.). The teams were known to be widely distributed, so that pushed us towards distributed SCM. The benchmark comparisons suggested that Git was faster than Mercurial in many operations, and the Bazaar people were still not settled on their final "on disc" format. Subversion was not well perceived for handling "branchy" development, and CVS was worse than Subversion.

The Linux kernel handles massive amounts of change (averaging 2-4 changes per hour continuously for the last 4 years) from many, many developers. It scales well for that widely distributed, branch intensive team, yet we're struggling with it. Of course, Linux kernel developers are even more likely to be command line biased than I am, and scaling the tool is not the issue that is getting in our way; it is more our choice to be "branchy" and the user interface weaknesses in git.

So many things to learn, so little time...