The “cloud” — ubiquitous and accessible network, compute and storage. It has radically changed how we create software products and think about software engineering, in some quite profound ways.

When I started my career as a software engineer in the mid-90s, we’d spend many weeks writing functional and user interface specifications before we started coding that cool new Windows or Mac application. Coding (typically in C/C++) meant thinking about data structures (linked lists, maps), binary serialization, and then moving up through the model, view and controller tiers. Donald Knuth, Graphics Gems and MSDN CDs were never far away! Getting help typically meant asking a colleague. Inspiration came from Dr. Dobb’s Journal or similar. The build process was based on hard-to-maintain Makefiles, run on a build machine.

The work was extremely detailed, error-prone, and time-consuming. We rarely used third-party libraries and had a very detailed knowledge of the software stack, down to the machine code if necessary.

Getting code to run on the public Internet would have required weeks of time-consuming form-filling, provisioning, purchase-order approvals, and legal and security reviews.

In the last 10 years, all of that has been virtually swept away for application developers.

Today I can git clone some existing code and get it automatically building and deploying to the public Internet in a few clicks and less than 10 minutes. That code will usually glue together various libraries and frameworks that I can reference and import from a module system and registry. I have no idea where the compute, network and storage resources are coming from physically; I can just consume them as services.

This has freed us up to create quick prototypes, applications to test risky business ideas, and most importantly to create the rapid feedback loop with the customer that is required for agile development.

The modern development environment contains:

  • A module system and registry for publishing, documenting and reusing code, ideally using semantic versioning
  • A build system that works locally, but also runs on the server side on every check-in
  • Powerful testing frameworks to define unit, system, performance and user interface tests
  • A plethora of libraries for creating REST APIs, web interfaces, cross-platform data structures, long-lived persistence, security, authentication, localization etc.
  • Languages and frameworks that can target multiple platforms, from mobile to mainframe, with everything in between
  • Easy (a couple of clicks) access to machines, network and storage required to run applications
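Semantic versioning (major.minor.patch) is what makes the module registry in the first bullet safe to depend on: consumers can declare ranges like ^1.2.3 and trust that only compatible releases match. A minimal sketch, in JavaScript, of how a registry client might compare versions and check a caret range (the function names and the simplified parsing — no pre-release or build metadata, and no special-casing of 0.x releases — are my own illustration, not any particular registry’s implementation):

```javascript
// Compare two semantic version strings (major.minor.patch).
// Returns -1, 0, or 1, like a standard comparator.
// Simplified sketch: ignores pre-release tags and build metadata.
function compareSemver(a, b) {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i] ? -1 : 1;
  }
  return 0;
}

// A caret range (^1.2.3) accepts any version with the same major
// number that is at least the base version. (Real semver tooling
// treats 0.x versions more strictly; that case is omitted here.)
function satisfiesCaret(version, base) {
  const [vMajor] = version.split('.').map(Number);
  const [bMajor] = base.split('.').map(Number);
  return vMajor === bMajor && compareSemver(version, base) >= 0;
}
```

Under these rules 1.4.0 satisfies ^1.2.3, while 2.0.0 does not — which is exactly what lets a registry serve bug fixes and new features without silently shipping breaking changes.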

For example, today one instantiation of this might be:

  • GitHub or GitHub Enterprise to store and document code, and track issues
  • JavaScript and Node.js as the programming language and application framework
  • Express to build web-centric applications (including REST APIs)
  • Mocha and Chai as JavaScript frameworks for defining tests
  • Travis for continuous integration (build servers)
  • JSDoc to generate HTML documentation from source code
  • npm as a module system and registry to reuse third-party code and publish libraries to upstream consumers
  • IBM Bluemix as a Platform as a Service to run Node.js applications
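Much of the wiring between these tools comes down to a single package.json manifest: it names the module for the registry, declares dependencies by semver range, and defines the scripts that both a local developer and a CI server run. A sketch of what one might look like for a stack like this (the package name, file paths and version ranges are illustrative, not prescriptive):

```json
{
  "name": "my-sample-app",
  "version": "1.0.0",
  "description": "Express app with Mocha/Chai tests and JSDoc docs",
  "main": "app.js",
  "scripts": {
    "start": "node app.js",
    "test": "mocha test/",
    "docs": "jsdoc -d docs/ lib/"
  },
  "dependencies": {
    "express": "^4.0.0"
  },
  "devDependencies": {
    "mocha": "^2.0.0",
    "chai": "^2.0.0",
    "jsdoc": "^3.0.0"
  }
}
```

npm install resolves the dependencies from the registry, and Travis and Bluemix can each drive the same npm scripts — which is part of why these pieces compose so smoothly.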

What is clear is that the whole is greater than the sum of the parts. These capabilities, working in concert, provide a very nimble and agile development environment that allows teams to explore new markets with limited up-front investment.