Saturday, March 29, 2014

Why change? Is your team aware of the value?

As change agents, we introduce changes in organizations we work for. Sometimes, these changes are integrated smoothly into the way the team works. In other cases, we find a lot of friction and resistance.

One of the things that facilitates change and alleviates friction is explaining the value of the change to your team. Although this seems straightforward, we usually do not exert enough effort to do it!

In one case, I didn't need to explain the value of a change, even though the change looked naive and had no apparent value. See what happened when I asked an audience of 220 people to change places: they just did it without a single objection!

This was part of my keynote last August at AgileAfrica 2013. I understand why someone would just follow a keynote speaker telling them to do something even without knowing its value. If I were sitting in their place, I would most probably assume there is value and wait to see it later on.

I have done this experiment with smaller groups, and the result was completely the opposite. They challenged the request (to change places). They were skeptical about the value of this change. It was necessary to convince them of the value of the change before they would do it.

Take care when you are leading change in organizations: in order to take the organization one step forward, you had better explain the value of the change to the whole organization, or at least to all stakeholders affected by the change.

Tuesday, March 18, 2014

Useful Statistics about Maintainability

Here are some useful statistics collected from several papers:

  • Approximately 70% of a software program's source code consists of identifiers. Therefore, identifier names are of paramount importance for readability
Unused Features: From this paper:
  • Jim Johnson (the Standish Group), in his keynote at XP2012, reported that 45% of the features in the analyzed systems were never used
  • An industrial business information system showed that 28% of its features were never used
  • 25% of all method genealogies were never used
Code Clones and Defects: Some very useful statistics from the paper "Do code clones matter?":
  • 52% of code clones are inconsistent
  • Of these inconsistencies, 28% were introduced unintentionally, and 18% are faults in the system
  • Overall, 10% of code clones are defective
  • 9%–19% conformance of the implementation to the architecture; 72%–90% of these non-conformances are due to flaws in documentation
  • 10%–28% decay in architecture
  • 20% effort increase in maintenance due to code clones
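To make "inconsistent clone" concrete, here is a hypothetical sketch (my own illustration, not from the paper): two functions that started as identical copies, where a boundary fix was later applied to only one copy, leaving a latent fault in the other.

```python
# Hypothetical illustration of an inconsistent code clone.
# Both functions began as identical copies; a boundary fix was
# later applied only to the first one.

def is_valid_percentage(value):
    # Fixed copy: 0 and 100 are accepted as valid percentages.
    return 0 <= value <= 100

def is_valid_discount(value):
    # Forgotten copy: still rejects the boundary values -- the kind
    # of unintentional inconsistency that the study found to be an
    # actual fault in a sizable share of cases.
    return 0 < value < 100

# The inconsistency only shows up at the boundaries:
# is_valid_percentage(100) is True, is_valid_discount(100) is False.
```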

Sunday, March 16, 2014

Duplication in Model Elements

Interesting idea. If you are using model-driven development, how would you know that you have duplicates in your models? Are you aware that duplication may be a source of many issues? If yes, are you aware that you have duplicates? And if yes, are you doing anything about it?

I came across a paper on this topic: Clone Detection in Automotive Model-Based Development. The authors implemented an algorithm to detect clones in models. Interestingly, the results of the study show that 37% of the model elements were duplicated at least once!

Model clones highlighted in red and light blue
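To give a feel for how such detection can work, here is a minimal sketch of a hash-based clone detector. This is my own simplification for illustration; the paper's actual algorithm for model clones is more sophisticated (it matches subgraphs of data-flow models), and the element representation below is an assumption.

```python
# Minimal sketch of structural clone detection: normalize away
# identifying details (names), then group elements whose normalized
# structure is identical.
from collections import defaultdict

def normalize(element):
    """Strip names so that structurally identical elements map to
    the same hashable key. An element is (kind, name, children)."""
    kind, name, children = element
    return (kind, tuple(normalize(c) for c in children))

def find_clones(elements):
    """Group elements by normalized structure; any group with more
    than one member is a set of clones."""
    groups = defaultdict(list)
    for e in elements:
        groups[normalize(e)].append(e)
    return [g for g in groups.values() if len(g) > 1]

# Tiny example: two structurally identical subtrees, one different.
a = ("block", "Ctrl1", [("gain", "g1", []), ("sum", "s1", [])])
b = ("block", "Ctrl2", [("gain", "g2", []), ("sum", "s2", [])])
c = ("block", "Filter", [("delay", "d1", [])])
clones = find_clones([a, b, c])  # a and b form one clone group
```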

Saturday, March 15, 2014

My Talk at Agile2013

Several months after participating in Agile2013 with an experience report about refactoring legacy code, I find it very useful to write down what I have learned, and my plans to benefit from this experience.

My talk at Nashville went very well, alhamdulillah. The audience were very interested in the experiences we've gone through. The first thing I learned is that when you deliver a talk, you gain a better understanding of what you are talking about! There is a saying: "المال تنقصه النفقة، والعلم يزكو بالإنفاق", or "spending money decreases it, but spending knowledge only increases it!"

The second thing I found is that teams really need a way out of the mazes of poor legacy code. I got two verbal comments and one written comment telling me that they are having so many problems with their code, and they felt that this roadmap of refactoring would be very applicable in their case. This was an indicator that I should continue running more experiments and enhancing the roadmap.

The results of the session evaluation were as follows:

Attendees: 50
  • 40 green (would recommend to others)
  • 6 yellow
  • 4 red
I had some interesting written testimonials as well:
  • "Amazing story of ingenuity and success! Excellent presentation"
  • "Very interesting talk. It gives me hope we can tackle our own mountain"
  • "Very good. Learned something we can apply"

One of the attendees gave me an interesting note: she missed half of the talk because of my English accent. This was excellent feedback. In later sessions, I added a disclaimer that I speak "Egyptian English", which is similar to English but requires you to concentrate more to translate as you hear :)

Another note is that data makes a difference with some audiences. It just clicks with them when they see data. In a later talk about process increments, I got similar notes from other attendees. What I felt is that there is a type of attendee who starts to believe what you say once they see some data!

Next Steps:
  1. Continue the ongoing experiments and collect more data to correlate refactoring with business metrics and demonstrate real return on investment.
  2. Conduct more experiments to collect more data. 
  3. Package the roadmap (along with the rules of the game, the guidelines, tips and tricks, and notes about tools), and give it a name.
  4. Enhance the tools. I really don't know how yet, but I'm sure this will emerge in time. 

For your reference, this is the PowerPoint presentation of my talk, and this is the full published paper.