Sunday, March 10, 2024

Digital Transformation, Really?

Judging by the solicitations I get from various vendors, "Digital Transformation" seems to be a big buzzword in business circles.  I find this strange.  When the Internet and Web first took off, 20 to 25 years ago, I could see companies needing a "digital transformation".  But now?  Maybe there are some fringe businesses that still rely on paper, or some very large legacy use cases, but for any company that is less than 20 years old, what do they mean when they try to sell me digital transformation services? And what larger company has not been working on doing things "digitally" for the last 20+ years? Why is this term still being used? Is it just the consulting industry's ploy to sell us services?

Thursday, March 7, 2024

The Startup Equation

This is what I call the startup equation for staffing.  If the total work is "W", the time available is "T", and the number of people you have is "P", then:

    W / T > P, no matter how large P is

At no point in time at any of the startups I worked at did we have enough people to do all the work we needed to be doing.  At first I complained, then I understood, and now I accept it.  If you are at a startup and feel you need more people, understand that it is no one's fault and there is nothing to be done about it: it is just the way the math works.

Friday, February 5, 2021

Albers Projection for Maps

I wrote this Python code a couple of months ago when I needed a simple, stand-alone way to plot cities on an SVG representation of a map.  I could not find any good, stand-alone Python references.  Even when I found something in other languages, its readability was somewhat lacking.

It seems the more math you know, the fewer characters you put in your variable names. Sigh.  Are they using meters or miles? Are they using radians or degrees? You have no idea from looking at the code, so you *have* to understand the math to be able to decipher it.
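By contrast, here is a sketch of what a more readable spherical Albers forward projection can look like, with the units and angle conventions made explicit.  The parameter names and defaults below are my own illustrative choices (the commonly used contiguous-U.S. parallels), not necessarily what my repository uses:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius, in kilometers


def albers_project(lat_deg, lon_deg,
                   lat_origin_deg=23.0, lon_origin_deg=-96.0,
                   std_parallel_1_deg=29.5, std_parallel_2_deg=45.5,
                   radius=EARTH_RADIUS_KM):
    """Spherical Albers equal-area conic projection.

    All angles are accepted in degrees; the return value is (x, y)
    in the same units as `radius` (kilometers here).  The projection
    origin maps to (0, 0).
    """
    # Convert everything to radians up front, so no reader has to guess.
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    lat_origin = math.radians(lat_origin_deg)
    lon_origin = math.radians(lon_origin_deg)
    parallel_1 = math.radians(std_parallel_1_deg)
    parallel_2 = math.radians(std_parallel_2_deg)

    # Standard spherical Albers quantities (see Snyder's formulas).
    n = (math.sin(parallel_1) + math.sin(parallel_2)) / 2.0
    c = math.cos(parallel_1) ** 2 + 2.0 * n * math.sin(parallel_1)
    rho = radius * math.sqrt(c - 2.0 * n * math.sin(lat)) / n
    rho_origin = radius * math.sqrt(c - 2.0 * n * math.sin(lat_origin)) / n
    theta = n * (lon - lon_origin)

    x = rho * math.sin(theta)
    y = rho_origin - rho * math.cos(theta)
    return x, y
```

Making the degrees-to-radians conversion and the unit of `radius` explicit is the whole point: a reader should not have to reverse-engineer the math to answer those questions.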

Developing this code required some investment of time in not just the math, but also the geographic uses of the Albers projection. I figured it might be generally useful, so I have made the code available on GitHub here:

https://github.com/cassandra/geo_maps

It includes an example that deals with some tricky issues that can come up if you want to plot points against a map of the U.S.  

When you see those U.S. maps where Alaska and Hawaii have been relocated to the lower left of the map, it is obvious they are not in their true locations and are drawn at a different scale than the contiguous 48 states. However, did you know that they also use completely different projection parameters *and* have been rotated?  I learned this the hard way.


Thursday, February 4, 2021

Starting Over

I read a thought provoking article:

How to hire senior developers: Give them more autonomy

It contains the provocatively titled section "Your Code is Worthless", and it makes a good case for this being true.  The context here is not that the running code is worthless to the business, but that the code itself is of no value to anyone else; i.e., worrying about a competitor stealing your code base is pointless.

A basis of the argument is that the software engineering process is not about creating an artifact (the code base), but about building a "theory".  Another way I have seen this idea expressed: the software development process is a group exercise in the organization of information.  The code is nothing more than a way to capture the outcome of that collaborative process.

To reach the conclusion that the resulting code is worthless, the article borrows from work by Peter Naur (of BNF fame) which says it is impossible for software, or its documentation, to encode all the information that was collectively organized and which is necessary to efficiently maintain it.  

There are a lot of good supporting details in that article, so it is definitely worth reading. Many of the points align with my experiences.  The one conclusion that got me thinking is the premise that it is cheaper and faster to start from scratch than to try to adopt an unfamiliar code base. I do agree that this is true for a competitor getting hold of another company's code. But could this apply to the company that owns the code?

If a company had 100% turnover of its developers, would starting over be the best choice for the company?

Engineers usually tend to think a rebuild is the right answer in this case, but the business side never does. Is the business perspective wrong?

I have been in situations where there was 100% turnover on a complicated code base.  I pushed to keep some of the previous developers on contract because I knew how crucial it was to leverage the information in their brains. Even if a rewrite was the right answer, it was not going to happen instantaneously, and the business needed to keep operating.

So maybe it is this continuity aspect that changes the equation.  If you have no choice but to figure out the code base, then a rewrite is only an added cost, not a replacement cost.

Monday, February 1, 2021

Can We Please Keep it Simple?


There is a famous quote about software that I have heard attributed to Brian Kernighan:

"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

I do not think you would find a software developer who disagrees with that statement, yet the field is overrun by a culture that seems to value succinctness more than readability.

When I began to write a lot of Perl, I wrote it like I wrote C programs and caught the ridicule of the Perl wizards who told me it was "better" to rewrite it as something that would wind up looking like this:

@P=split//,".URRUU\c8R";@d=split//,"\nrekcah xinU / lreP&&
close$_}%p;wait until$?;map{/^r/&&<$_>}%p;$_=$d[$q];

I continued to write the code simply, with the fewest "Perl-isms" I could. Our hiring budget would only allow us to hire mere mortals to help maintain it.

Ten years later, the pattern repeats itself. When I write Python, people point out all the extra syntax I do not need, even though it is a deliberate attempt to make things more explicit and readable. Optimizing character and line counts seems to be a reflex for many programmers.
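To illustrate what I mean (a made-up example, not from any real code base), here is the same dictionary-building logic written both ways:

```python
data = {
    "alpha": [1, -2, 3],
    "beta": [-1, -4],
}

# The terse, "clever" one-liner:
terse = {k: sum(v) for k, v in data.items() if any(x > 0 for x in v)}

# The explicit version I would rather maintain: each step is named,
# so the intent survives even when the reader is in a hurry.
explicit = {}
for name, values in data.items():
    has_positive_value = any(value > 0 for value in values)
    if has_positive_value:
        explicit[name] = sum(values)
```

Both produce the same result, but only one of them can be skimmed at 2 a.m. during an outage.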

I have grown to abhor this sentence:

"Look what I can do in just one line of code."

This is usually pitched to me in the context of someone trying to convince me of the wonders of the latest, greatest programming language.

Of all the languages I have learned, there have been only a handful of language advances I have encountered where truly "better" syntax was introduced in terms of readability.  Some of them include:

  • try-catch exception handling
  • "finally" blocks
  • "else" clauses on "for" loops
  • list comprehensions (but only if used judiciously)
  • string interpolations
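As a quick Python sketch of two of these ("else" on loops and string interpolation); the function and data here are hypothetical:

```python
def search_message(users, target):
    """Return a human-readable result message for a linear search."""
    for user in users:
        if user == target:
            # String interpolation via an f-string: the value is
            # visible right where it is used.
            message = f"found {target}"
            break
    else:
        # The loop "else" runs only if the loop completed
        # without hitting "break" -- no sentinel flag needed.
        message = f"{target} not found"
    return message


print(search_message(["ann", "bob"], "bob"))    # found bob
print(search_message(["ann", "bob"], "carol"))  # carol not found
```

The loop "else" removes the classic `found = False` flag variable, which is exactly the kind of syntax advance that improves readability rather than just saving characters.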

In general, if the syntactic feature is specific to a language, I will avoid using it.  I shift around writing code in many languages and the more I can do to minimize the context shift, the better.

With software, readability is 99% of the problem to be solved. As Martin Fowler has said:

"Any fool can write code that a computer can understand.  Good programmers write code that humans can understand."

I was inspired to share these thoughts after reading this article that also makes the case for keeping it simple:

 Simplistic programming is underrated

I have also recently read this related article:

Developers spend most of their time figuring the system out

This is definitely in line with my experiences and helps to emphasize how important code readability truly is.


Sunday, January 10, 2021

Collective Fictions of Software Methodologies


Software methodologies are a collective fiction. Necessary, but a fiction nonetheless.  This article is where I first saw the concept:

My 20-Year Experience of Software Development Methodologies

My experiences and conclusions have been similar.  A software team needs something to organize them, but exactly what is used does not matter that much.  Really, all you need is:

  1. A system to organize the work
  2. A system to communicate

All the software methodologies provide these elements. The bottom line is that any methodology can work with good people and every methodology will fail with bad people.  

As one of the cartoons in that article alludes to: the goal of a methodology is predominantly about avoiding chaos.

 

Sunday, December 20, 2020

Microservices - Why?

 


I read an article with the basic theme:

“Microservices was a good idea taken too far and applied too bluntly."

Link: Microservices — architecture nihilism in minimalism's clothes

I agree with a lot of what is in there as I have seen the complications this design pattern leads to and often wondered if it was worth it. 

This general behavior matches a familiar one that happens time and again. A similar thing happened (and is still happening?) with NoSQL, and you could say the same thing:

 “NoSQL was a good idea taken too far and applied too bluntly."

Ditto for the semi-recent trend of "Single Page Webapps". XML was another one.  The list is broad and deep, and goes further back than my time in the field.

Why is there the constant trend in software development of chasing the latest idea and overusing it until it collapses on itself? Haven't we seen this pattern enough to not repeat our mistakes? Ultimately, this is an immature behavior, but what is the source of that immaturity?  Is it simply because the field only goes back a few decades? Is it that the field is dominated by a younger and less experienced group of people?

Software engineers are initially drawn to the discovery aspect of technology, so they naturally gravitate toward the "new".  Human nature leans toward doing the familiar, often ignoring the "why". There is also the time required to learn a new technology, which is a driver for getting the most out of that investment. Maybe these combine so that, once the new thing is learned, engineers tend to propagate that new pattern and/or technology without asking "why".

Maybe it is an education problem.  Are we giving students these lessons and warnings in their Software Engineering classes? Wouldn't it be helpful if every student coming out of school knew about this danger and could recognize it as quickly as they can regurgitate the big-O complexity of a bubble sort?

There's also a status culture in software where the "coolness" factor is tied to the "newness" factor.  Many engineers look down upon the use of older technologies, ridiculing their use and sometimes shaming people into adopting new tech.  Who wants to be coding in PHP and be a social outcast from all the cool kids?  This behavior is especially troubling because it promotes the idea that the technology is more important than the problem it is meant to solve.

As an engineering leader, it is important to combat these less-than-rational reasons for adopting a technology. I think I am often viewed as a curmudgeon about adopting new technology, but asking "why" is the responsible thing to do.