Sunday, January 20, 2013

TTT Curve - Economic Growth - and the Source Code [Cont]

In my previous post we skimmed through the T-T-T curve, especially for people who don't have the relevant background.



source: http://www.materialsengineer.com/E-Steel%20Properties%20Overview.htm

Now, let's talk about its connection with the Economy and yes - the Source Code.



Year 1991 is marked as a turning point in India's economic history, when the then finance minister Dr. Manmohan Singh introduced many reforms, which created many opportunities. Thereafter India enjoyed massive economic growth, especially during the period 2004 to 2008. India's GDP (Gross Domestic Product) growth since 1991 has stayed impressive, even when developed countries were stagnant. However, India's HDI (Human Development Index) has continuously regressed. I used to ask myself, "Is too much economic growth good or bad?" I believe in the Laffer Curve, which gave me the same answer that I had derived the other way.

When the economy grows, it changes many things. You have new factories producing products that people will consume. The same factories create jobs for people, who in turn buy these products. As people create wealth, they buy houses and new cars. They migrate to cities. The city cannot accommodate such a large rise in population, so we build multi-storey towers, which again gives business to construction companies, which generates more money and employment. People go to shopping malls, multiplex theatres and restaurants to spend the money, and so on... A goody-goody picture painted by Capitalism.

But the picture is definitely not that rosy. All the things above have created a severe Environmental Debt. I completely agree that growth has to occur; there is no progress without economic growth. However, when growth happens, the system has to change its Internal Structure. Widening the roads and increasing the allowable height of buildings does not solve the problem; rather, it complicates the problems or creates new ones. The city needs to be re-built with a new design, infrastructure, schools, gardens, hospitals and so on, towards its new needs. But that has not happened. Capitalism has always painted the illusion that the system takes care of itself: the strong drive within individuals to acquire more money and resources supposedly helps the whole society to improve. To some extent that's true, but only to some extent.

What makes the difference here is the sheer Rate of Change. Look at the TTT curve: there is something called the Critical Rate of Change (the analogue of the Critical Cooling Rate). A very high rate of change (e.g. rapid economic growth) is required to achieve higher GDP, but it also creates a lot of internal stresses and distortion. Sometimes we cannot control the rate of change; we are at its mercy. But we can certainly control the "structure of the economy" (i.e. the grain structure) and "Government Policies" (i.e. the alloying elements), which will shift the TTT curve towards the right side of the x-axis.

What happened to India during 2004-2008 has also happened to my beloved city of Pune, where I live. Historically this city has been a hub for numerous research and educational institutes like the "National Institute of Virology", the "India Meteorological Department", the "Bhandarkar Oriental Research Institute" and so on. However, the growth of the city in the last decade has been based only on Consumption rather than Creation. We didn't add any more premier institutes; what we added were only shopping malls and restaurants.


The same thing has happened to many software companies. I know a few software companies which were small in size but served a very niche market. They had great people working in a very open and professional environment, nourishing a high value system. Needless to mention, they were making good money and were also contributing to society. Working with those few companies was considered a symbol of pride. During 2004-2008, the market grew very rapidly. Investors and the stock market pressured companies to exceed market expectations every quarter. The supply of people was short. To make the billing happen and keep the business ticking, these companies started recruiting people by the tonne without really checking whether they fit the company's values and culture. Recruitment heads of these companies were given "Headcount Targets"! This encouraged mediocrity, and the companies lost their reputation. They are somehow sustaining the growth, but they have lost their carefully preserved organizational culture. It is the sheer rate of change, without a proper grain structure and without alloying elements, that killed those companies.

The same thing has happened to the source code. It was back in 1975 that Fred Brooks wrote the famous and beautiful book "The Mythical Man-Month". But it seems we are still doing the opposite. The expected turnaround time for projects to deliver has shrunk tremendously. To achieve that, managers put a lot of developers and testers on the project without a smooth ramp-up. There are frequent governance reviews, so frequent that team members spend more time preparing for reviews than actually writing code or testing it. Projects get delayed, and management once again adds more people in the hope of bringing things back on track. The quality of the code degrades rapidly, and the project never gets finished.

Expected turnaround time is going to shrink further. We are going to get developers and testers who may not have the required competencies. In fact, the gap in competencies is going to be very high. And we may not be able to afford to be very choosy in selecting people, as the pressure from investors and the stock market to show QoQ results will be tremendous. This is a fact of life, which we have to accept. When the economy grows fast, companies are always short of skilled people. We are not talking about niche companies here but about the general market, especially in the context of typical Indian software companies.

The question is: can we still make investors happy without sacrificing the internal quality of the code and without diluting the organization structure? We need to think differently. The effect of grain size and proper alloying elements helps a lot. In terms of code, a fine-grained structure (i.e. highly cohesive, loosely coupled software components written with the Single Responsibility Principle) is able to withstand a high rate of churn. The best alloying element is "Testability and Tests". It is often difficult for companies who recruit in masses to teach newcomers about writing good code, software design, Design Principles, Design Patterns, Programming Paradigms, usability, rapidly changing technologies and so on in a short span of time. But if we teach people, work with them and show them how to write testable code, many other things fall into place automatically. Testability is the way to scale. It's the best alloying element.
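To make this less abstract, here is a minimal sketch in Python (the names DiscountCalculator and FakeRateProvider are hypothetical, invented for this post, not taken from any real code base): a small component with one crisp responsibility and an injected collaborator, which is exactly what keeps it testable when a rapidly grown team churns the code around it.

# Hypothetical sketch: a single-responsibility component with an injected
# dependency, so it can be unit tested without a real database or network.

class DiscountCalculator:
    """Computes order discounts; knows nothing about storage or UI."""

    def __init__(self, rate_provider):
        # The dependency is injected, so tests can pass a fake provider.
        self._rate_provider = rate_provider

    def discounted_total(self, order_total, customer_id):
        rate = self._rate_provider.rate_for(customer_id)
        if not 0.0 <= rate <= 1.0:
            raise ValueError(f"invalid discount rate: {rate}")
        return order_total * (1.0 - rate)


class FakeRateProvider:
    """Test double standing in for a slow or external rate service."""

    def __init__(self, rate):
        self._rate = rate

    def rate_for(self, customer_id):
        return self._rate


def test_discounted_total_applies_rate():
    calc = DiscountCalculator(FakeRateProvider(rate=0.10))
    assert calc.discounted_total(200.0, customer_id="C42") == 180.0

Because the collaborator is injected rather than hard-wired, a newcomer can change or extend the rate lookup without breaking the calculator or its test; the fine grain plus the test is what absorbs the high rate of churn.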

I learnt the magic of testability when I attended GTAC-2010 ("Google Test Automation Conference") a couple of years back. The talk by Russ & Tracy, "Flexible Design? Testable Design? You don't have to choose!", really provoked me. I went back to my environment and applied it to multiple code bases from projects with different challenges. It's amazing to see how simple it is! It really works!

Thursday, January 17, 2013

TTT Curve - Economic Growth - and the Source Code

In the last few posts, we have been discussing interesting parallels between working with metals and the code. We discussed properties of metal (and the code) like Strength, Ductility, Malleability and Toughness. We also talked about Case Hardening. In this post, we will discuss something more interesting: the Time-Temperature-Transformation (TTT) Curve and its relation to software development.


The abstract on the above site mentions:
The traditional route to high strength in steels is by quenching to form Martensite that is subsequently reheated or tempered, at an intermediate temperature, increasing the toughness of the steel without too great a loss in strength. The ability of steel to form Martensite on quenching is referred to as the hardenability. Therefore, for the optimum development of strength, steel must be first fully converted to Martensite. To achieve this, the steel must be quenched at a rate sufficiently rapid to avoid the decomposition of Austenite during cooling to other stable forms like Ferrite, Pearlite and Bainite.
In this post we are going to discuss Hardenability. The ability of steel to form Martensite is called Hardenability. It can be studied by plotting the Time-Temperature-Transformation (TTT) curve. Steel is heated above its Austenitizing temperature and cooled down at different rates. A curve is plotted showing where the structural transformation of the steel into more stable forms begins and ends. This results in a C-shaped curve.

What does the C-shaped curve signify?

The position of "C" along the x-axis leaves a narrow neck. Steel can be converted into fully Martensite - if & only if - its cooling rate escapes the "C" region of the TTT curve. The cooling rate that's tangent to the "C" is called "Critical Cooling Rate". It is the minimum rate at which the steel must be quenched in order to get fully Martensite. If steel is cooled slower than this critical rate, then it results in other structural forms like Bainite, Pearlite, which will not give required hardness.

So what's the issue? It's possible to ensure that steel is cooled rapidly enough to have the desired hardness. But the negative side effect is that rapid cooling leaves undesirable internal stresses in the steel, which make it brittle. Also, the rate of cooling is so rapid that some of the Austenite never gets a chance to convert into Martensite. This leaves some Retained Austenite, which makes the properties of the steel unpredictable.

The addition of alloying elements to steel plays a very important role here. All alloying elements except Cobalt shift the C-curve of the TTT diagram towards the right side of the x-axis. This lowers the Critical Cooling Rate, which means you can afford to cool the steel a bit more slowly while still converting it completely to Martensite and getting the desired mechanical properties for the steel component you are making.

I have added nothing new so far. Every Mechanical Engineer knows it and every Metallurgist breathes it. If anyone notices errors in the above technical content, I would appreciate it if you could bring them to my notice.

Now the other interesting thing starts here. What is the relation of this post to Economic Growth, Software Development and the Source Code?
Well, I'll put it in my next post, as this one is getting longer.

Working with Metals & the Source Code - III

Now, let's talk about something more interesting - Case Hardening.

Hardness is an extremely desirable property of most moving mechanical components. If the gears in your car's transmission system are not hard, they will wear out quickly. Making steel hard is not that hard. Increasing the amount of carbon and other alloying elements increases the hardness, as does its grain structure, which can be controlled by proper heat treatment.

However, Hardness comes with a gift: Brittleness. Typically, hard components have low toughness. They fail in a brittle manner, without giving proper warning before failing. We cannot afford to break a gear!

So what do we do? Whoever invented Case Hardening must have been very innovative. We combine a high-carbon hard material and a low-carbon ductile one in a single component! There are various ways to do it; one is carburizing. Heat the steel component in a furnace that has a controlled, high-carbon environment. The material absorbs the carbon, but the depth of its penetration is very carefully controlled. As a consequence, the component becomes hard only at its surface, up to a certain case depth, while its core remains soft.

What does it mean?

When a case-hardened component is subjected to impact, a crack initiates, as the case is brittle. However, the soft core doesn't allow the crack to propagate. That helps the component not to fail.

Why are we talking about this, which everyone knows already? Well, I just want to share an interesting analogy, though not a perfect one. Mechanical components require hardness as they need Wear Resistance. Software doesn't wear out with use, but the other part, crack propagation, is important.

When a certain component (say a library or a class or a function, etc.) fails due to an unanticipated impact force such as Bad Input, we don't want that crack to propagate throughout the software. We want our software to detect it and stop it. The earlier it is detected, the easier it is to handle. Even if the software fails, it should fail gracefully, giving useful information to its users and developers.

Yes, we are talking about Assertions.
If I have to name the two most powerful tools I use while writing code, they are (1) Commit Early, Commit Often AND (2) ASSERTs.

Asserts detect the crack right at the point where it forms. They fail fast. This tool is useful not only while writing code, but also in personal life. After all, code reflects the personality of the developer; the two cannot be separated. Well, that is another interesting thing that we will talk about when we meet in person.
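A tiny, hypothetical illustration (the function below is made up for this post, not from any real code base): the assert stops the crack right where bad input enters, instead of letting a meaningless value propagate into later calculations.

import math


def section_stress(force_newton, area_mm2):
    """Stress = force / area. Fail fast on bad input instead of
    silently returning a meaningless stress value."""
    assert area_mm2 > 0, f"area must be positive, got {area_mm2}"
    assert not math.isnan(force_newton), "force must be a real number"
    return force_newton / area_mm2


print(section_stress(1000.0, 50.0))   # 20.0
# section_stress(1000.0, 0) would fail right here, at the point of entry,
# with "area must be positive, got 0", rather than three modules downstream.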

Working with Metals & the Source Code - II

In my previous post I started sharing some interesting parallels between working with metals and the code. In particular, we talked about Ductility and Malleability.

In this post, let's talk about:
  • A few other very important properties: Strength, Toughness and Hardness, AND
  • The effect of Grain Size on these properties.

Of course these matter to the source code as well, just as they matter to metals.
Strength is easy to imagine, and it is a very important property: the ability to withstand force without fracture or undesirable deformation. The strength can be against Tensile or Compressive forces. It can be against impact (Toughness) or cyclic forces (Fatigue).

Once we ship the software, we have no control over how it will be used!

We can of course provide elaborate documentation, which no one will read. When did you last bother to open the user's guide of the digital camera you purchased? When you can't figure out how to alter the shutter speed, you play with various controls and try things out. It may work or it may not. If it doesn't, you take help from a friend or dial customer support. When nothing works, you open the user guide.

That's natural! The point is, we cannot predict how the interfaces of the software will be used, in what context, in what sequence, or whether they will be passed correct data. We cannot predict the network speed at the user's end. If it's a client-server application, we don't know how many users will try to connect simultaneously. We can anticipate and design for that, while making some assumptions. If the software crashes or corrupts the user's data, it's very embarrassing and shameful for us, no matter whether or not the user used it as prescribed.

Yes, we are talking about Strength: Tensile, Compressive, Impact, Fatigue and all. In order to be sure we are shipping robust software, we test it rigorously inside our labs as well as with beta users, or in some innovative ways like crowdsourcing. Once tested, we are very careful about changing the source code. In fact, many teams discourage modifying the source code between testing and release. They call it Code Freeze, and with some teams I have seen it happen months before release. I cannot understand "Code Freeze": how can code be frozen? It must flow :-). But well, I understand the motivation.
Here we take help from the "Open Closed Principle" (OCP), where we close the code for modification but keep it open for extension. That enables us to implement user requests and bug fixes. I'm a fan of OCP, but its overuse has negative side effects. Too many extensions create too many Grain Boundaries.
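To make OCP concrete, here is a minimal, hypothetical sketch (the report writer and formatter names are invented for this post, not taken from any real code base): the already tested writer stays closed for modification, and a new output format arrives purely by extension.

# Hypothetical sketch of the Open Closed Principle: ReportWriter is closed
# for modification; new formats are added by extension only.

from abc import ABC, abstractmethod
import json


class ReportFormatter(ABC):
    """Extension point: new formats plug in here."""

    @abstractmethod
    def format(self, rows):
        ...


class CsvFormatter(ReportFormatter):
    def format(self, rows):
        return "\n".join(",".join(str(v) for v in row) for row in rows)


class ReportWriter:
    """Tested and 'frozen' code: it never changes for a new format."""

    def __init__(self, formatter):
        self._formatter = formatter

    def write(self, rows):
        return self._formatter.format(rows)


# A new requirement after the code freeze: JSON output.
# We extend; we do not modify ReportWriter or CsvFormatter.
class JsonFormatter(ReportFormatter):
    def format(self, rows):
        return json.dumps(rows)


print(ReportWriter(CsvFormatter()).write([[1, 2], [3, 4]]))   # "1,2" and "3,4"
print(ReportWriter(JsonFormatter()).write([[1, 2], [3, 4]]))  # [[1, 2], [3, 4]]

Each new formatter is another grain added at a clean boundary: tested code stays untouched, but the number of small pieces grows. Yes, let's talk about grain boundaries now...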

Most real materials are polycrystalline. Roughly, a crystal is how atoms or molecules in the metal are arranged together to form a unit. Each individual crystal in a polycrystalline material is known as a Grain. The regions where grains meet are known as grain boundaries.

A material with large individual grains has fewer grain boundaries. This is called a coarse-grained structure. A fine-grained structure is one which contains smaller grains, resulting in more grain boundaries.

Fig 1: Coarse & Fine grain structure. source: www.sciencedirect.com

What do Grain Size and Grain Boundaries mean for the software and the source code?


One can relate a Grain to an individual library or a package or a class or a function, etc.; let's say the individual components of a system. If you design components with clear focus and crisp responsibilities, i.e. the Single Responsibility Principle, it results in a fine-grained structure. A fine-grained structure has better mechanical properties like Strength, Toughness and Hardness compared to a coarse-grained structure.

Fig 2: source: http://www.southampton.ac.uk/~engmats/xtal/deformation/control.htm


However, there is one problem with a fine-grained structure.


Grain boundaries act as barriers to slip. This is because they have random crystallographic orientations, which means that the slip systems of adjacent grains are not aligned. When a dislocation reaches the grain boundary it must either change direction or stop moving. Changing the direction of dislocation movement uses up energy so the plastic deformation of the material becomes more difficult.


That's good! But oh, that's not good as well!! We need our code to be fine-grained so that it has the required strength, toughness and hardness, but at the same time we also want to be able to shape it when we want!
Well, a large number of grain boundaries does not necessarily mean that the code cannot be changed. As explained in the figure above, boundaries arranged in a haphazard way prevent the relative movement of grains. But if they are aligned, they can support the movement.

Indeed, grain boundaries represent the interactions between the individual components of software. Badly designed software typically means badly arranged grains. There is always a "desirable fine-grained structure" of the code and an "undesirable fine-grained structure". The desirable ones are highly cohesive and loosely coupled systems. They happen when one designs with the "Open Closed Principle", the "Single Responsibility Principle" and design-by-contract. Such a system has high strength, toughness and hardness, and at the same time it is easy to change as well!
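As a small, hypothetical sketch of design-by-contract (plain Python asserts, no contract library, and a made-up transfer function): the component states its preconditions and postcondition explicitly at its boundary, so the grain boundary between caller and callee is aligned rather than haphazard.

def transfer(source_balance, amount):
    """Contract:
      pre:  amount > 0 and amount <= source_balance
      post: result == source_balance - amount and result >= 0
    """
    # Preconditions: the caller's obligations.
    assert amount > 0, "amount must be positive"
    assert amount <= source_balance, "insufficient balance"

    new_balance = source_balance - amount

    # Postcondition: this function's promise to its callers.
    assert new_balance >= 0, "postcondition violated: negative balance"
    return new_balance


print(transfer(100.0, 30.0))  # 70.0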

How "undesirable fine-grain systems" are formed? This typically happens when we have a big team formed rapid ramp-up, writing code without tests & version control discipline. Every developer starts writing code with nes half-cooked understanding of the system. There is no re-usability. In fact there is no usability in first place :-) as different components badly interface with each other, they are tightly coupled & no cohesion. No wonder such systems are brittle in nature. Compare this with the molten metal that is  cooled down very rapidly. The rapid cooling rate results in many centers of nucleation of grains which grow rapidly & meet fellow grains to form boundaries. If the material is thick, then the rate of cooling is also uneven all over the places leading to unpredictable properties of the metal.

But don't worry, they can be cured. You only need to put them into the Refactoring Furnace, heat them above the critical temperature where the grain structure gets corrected, and then cool them down at a controlled rate. But you need fuel to fire your refactoring furnace; how will you do that? Yes, you guessed right, it's Testability. Build testability into your code, and most other things will fall into place. Watch this video: "Flexible Design? Testable Design? You don't have to choose!" by Russ & Tracy. It's amazing, it works! It has worked for me! Wish you all the best!
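A tiny, hypothetical example of what fuel for the refactoring furnace can look like (the pricing function and its numbers are invented for this post): pin down the current observable behaviour with a characterization test first, and only then reshape the internals.

import math


def legacy_price(quantity, unit_price, customer_type):
    # Tangled legacy logic that we do not fully trust yet.
    if customer_type == "gold":
        if quantity > 10:
            return quantity * unit_price * 0.85
        return quantity * unit_price * 0.95
    return quantity * unit_price


def test_characterize_legacy_price():
    # Expected values captured by running the code as-is; they document
    # what the system does today, not what it should do.
    assert math.isclose(legacy_price(5, 10.0, "gold"), 47.5)
    assert math.isclose(legacy_price(20, 10.0, "gold"), 170.0)
    assert math.isclose(legacy_price(3, 10.0, "regular"), 30.0)

With this safety net green, the body of legacy_price can be renamed, split and simplified while the tests keep guarding its behaviour.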

Wednesday, January 16, 2013

Working with Metals & the Source Code

I'm a Mechanical Engineer and have always loved Metallurgy, both during my studies and in my profession. I have been writing software for Engineering Applications for about a decade now. There are interesting parallels between working with code and with metals. Computer scientists drew inspiration from Metallurgy in the form of algorithms like Simulated Annealing; however, this post is about some simpler stuff.

Here are a few most important properties of metals:

  • Strength --> Maximum stress a metal can withstand before fracture
  • Ductility --> Ability of a material to deform under tensile load
  • Hardness --> Ability to withstand surface indentation
  • Fracture Toughness --> Impact energy absorbed per unit area before fracture



Strength matters to the code as well. Just like metals, the code is subjected to various forces: the forces of changing user requirements, changing hardware and software environments, adapting to different third-party environments and so on. Not to mention the pressure from investors to deliver QoQ profits, and managers trying to squeeze everything into a tiny budget and timeline!

Here is how the Stress-Strain curve looks (taken from Wikipedia):



What happens when you want to stretch a thin wire out of a metal - say Aluminium or Steel?


As the metal gets stretched (and so does the code), it starts developing internal stress. The original design (lattice structure), though deformed and strained, is able to absorb the force of changing requirements and environments.

As we stretch further, it becomes harder and harder to stretch it. This phenomenon is called Strain Hardening. The software was not designed to accommodate unanticipated changing conditions. The grain boundaries between the various interacting components resist further ductility.

A point is reached when it's extremely hard to work with and it starts breaking. Ductile code shows early signals before failure, while brittle code, like brittle metal, fails abruptly!

So what do you do? We need the wire to be stretched and the sheets to be hammered.
Our investors want to continue making money from the software, and that's how we get our salary cheques! Customers want us to produce more usable features, and our managers want all of this within a modest budget!

Yes, there is a way forward: Heat Treatment! We put the metal into a furnace, heat it and cool it down at a controlled, slow rate. If it is a steel, we heat it beyond what is called the "Austenitizing Temperature". This is the temperature at which steel transforms into "Austenite", from body-centred cubic (BCC) to face-centred cubic (FCC). With this new internal structure, it is able to withstand further tensile forces without fracturing.

Can we do this with code? Can we change its internal structure so that it can be worked on further for some more time? I wish I could put the code into some furnace and heat it!
Indeed, there exists such a furnace: the Refactoring Furnace!

The malleability of steel and its ability to be heat treated depend mainly on its constituents, called "alloying elements". Steel is primarily made of Iron and Carbon, but it also contains alloying elements like Mn and Ni.

Our Refactoring Furnace can heat treat our code and improve its internal structure so that it can be stretched and hammered further. But that is not possible without adding an alloying element. We need some alloying element that makes it easy to heat treat the code. That alloying element is called "Testability & Tests"!

Your code is working, but do you think it's badly designed? Are you afraid to change it further? Are you getting dragged down by the enormous amount of Technical Debt carried over? Don't lose heart! You can still refactor it. You just need to add this alloying element called "Testability". Thanks to Michael Feathers for his beautiful book "Working Effectively with Legacy Code".
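As a parting, hypothetical sketch in that spirit (inspired by the dependency-breaking ideas in that book, with invented class names, not an excerpt from it): a legacy class that talks straight to a mail server can be given a seam that a test overrides, making it testable without changing its observable behaviour.

# Hypothetical sketch: introduce a seam into legacy code so a test can
# slip in, without changing the class's observable behaviour.

class InvoiceMailer:
    """Legacy class: originally sent mail directly inside process()."""

    def process(self, invoices):
        for invoice in invoices:
            if invoice["amount"] > 0:
                # The direct SMTP call was extracted into a method;
                # that extraction is the seam a test can override.
                self._send(invoice)

    def _send(self, invoice):
        # In production this would talk to a real mail server.
        raise RuntimeError("no mail server available in this sketch")


class RecordingMailer(InvoiceMailer):
    """Test double: overrides the seam instead of sending real mail."""

    def __init__(self):
        self.sent = []

    def _send(self, invoice):
        self.sent.append(invoice["id"])


def test_skips_zero_amount_invoices():
    mailer = RecordingMailer()
    mailer.process([{"id": 1, "amount": 0}, {"id": 2, "amount": 250}])
    assert mailer.sent == [2]

Once a few such tests are in place, the alloying element has been added, and the real heat treatment, refactoring the internals, can begin safely.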