Article · June 2, 2022 · 6 min read
What was the world like from a technology perspective in 2007 when Nitor, my current workplace, was set up? Many things have changed, but there are also some aspects that have stayed the same. Let’s take a walk down memory lane and see how the tech world has evolved since Nitor was founded.
At the time, I was working with software and system architecture at Tieto, juggling web services on application servers and portals to production.
In 2007, we witnessed the golden era of Java portals and the early signs of web front-end development technologies. Cloud computing was only just getting started. Local data centres and snowflake servers dominated the deployment targets. The IT industry was and still is balancing between different approaches; centralisation versus distribution.
What the IT scene looked like in 2007
The year 2007 was in many ways a good moment for Nitor to get started. Realism had set in after the dot-com bubble, and things were looking up again for experienced consultant architects and developers. There was clearly demand for a consulting agency with a pragmatic approach: delivering on clients' business goals sustainably rather than living on the edge chasing the very latest experimental technologies.
Local data centres and snowflake servers (you know, each one a unique brittle creation) were still mainstream deployment targets for software. Serious workloads were deployed on mainframe computers such as the IBM System z9 running z/OS while server racks might have had nimble newcomers to the market such as the Sun SPARC Enterprise line of servers with the Solaris operating system.
While waiting too long for software builds to complete (something we still do today), we'd debate the pros and cons of RISC processor architectures like Sun's SPARC against the x86-64 CISC architecture, which was and still is the norm for the Linux servers that operators would name affectionately and keep like pets.
It would be a few more years until cloud computing caught on, although people with foresight could see the cloud coming. Amazon had released EC2 virtual machines in beta just the year before, in 2006!
Bootstrapping with Java
The above hodgepodge of processor architectures might sound like trouble for software development. Code would have to be compiled to run on each architecture separately, with each having its own intricacies. This was a non-issue, however, due to the industry’s favourite programming language, Java. The Java virtual machine ran (and still does!) on every relevant architecture and operating system. Write once, run anywhere!
Java was on version 6 back in 2007. Annotations, introduced in Java 5, had matured, and developers felt an enormous relief configuring the increasingly popular Spring Framework (whose 2.5 release arrived that very year with annotation-driven configuration) directly in code rather than in a flow-breaking XML file that had more markup than configuration payload.
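The appeal of annotations is that configuration lives next to the code it describes and can be read back at runtime. As a minimal, framework-free sketch (the `@Endpoint` annotation and `describe` helper here are hypothetical illustrations, not Spring's API), plain Java annotations can be declared and inspected via reflection:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class AnnotationDemo {
    // A hypothetical annotation standing in for framework configuration
    @Retention(RetentionPolicy.RUNTIME)
    @interface Endpoint {
        String path();
    }

    static class GreetingService {
        @Endpoint(path = "/hello")
        public String hello() { return "Hello!"; }
    }

    // Read the configuration back via reflection, as a framework would
    static String describe(Class<?> type) {
        StringBuilder sb = new StringBuilder();
        for (Method m : type.getDeclaredMethods()) {
            Endpoint e = m.getAnnotation(Endpoint.class);
            if (e != null) {
                sb.append(m.getName()).append(" -> ").append(e.path());
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(describe(GreetingService.class)); // prints "hello -> /hello"
    }
}
```

Contrast this with the XML era, where the same mapping would live in a separate file that the compiler could not check against the code.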
Java had good momentum going for it with the support of pretty much everyone but Microsoft (it took Microsoft until 2021 to get back on the Java bandwagon after a tumultuous and litigious past). Java was certainly too big to fail, but in hindsight, it was also too big to evolve. It took almost five years for Java 7 to come out, and many long-awaited features only surfaced in Java 8 in 2014.
While Java had been free and open source since late 2006, the money was made with application servers and web portal platforms. These behemoths of the Java ecosystem were the public transport vehicles of service delivery. Everyone would cram their applications onto the same servers (after all, licence costs were considerable!) and skilful operations personnel were needed to make sure everything coexisted peacefully within the same operating system process.
Some young mavericks at the time had started to head in the direction of smaller distributed service runtimes packaged as überjars, which turned the tables on the app server business. Instead of deploying applications into a single shared runtime, the server components would be embedded in each application separately. It would take until 2011 before this became a mainstream line of thinking, with Forrester declaring application servers dead.
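The embedded-server idea can be sketched with the HTTP server that has shipped in the JDK itself since Java 6, with no application server in sight. This is a minimal illustration of the pattern, not what any particular überjar vendor shipped:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class EmbeddedServer {
    public static HttpServer start(int port) throws Exception {
        // The application owns its server instead of being deployed into one
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/status", (HttpExchange exchange) -> {
            byte[] body = "OK".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        start(8080);
        System.out.println("Listening on http://localhost:8080/status");
    }
}
```

Running `java EmbeddedServer` gives you a self-contained service process: the exact inversion of deploying a WAR file into a shared application server.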
The client-server balancing act
Taking a balcony view of the IT industry over the years, the take on centralisation versus distribution appears to alternate on a spectrum between the extremes, like a pendulum swinging back and forth. Before the web, we had native applications and thick client architecture.
Back in 2007, with the web firmly established, the industry leaned heavily towards centralisation: app servers, portals, and monolithic application architecture. Even web user interfaces were written on the server side with frameworks such as Apache Struts. Google Web Toolkit and Apache Wicket, also server-side, were up-and-coming UI frameworks we had our eyes on but have since abandoned in our continuous search for frontend nirvana.
The Finnish Java web framework Vaadin was going through its formative years. It would take until 2009 for the framework to get its reindeer-inspired name, though.
Signs of a client-side renaissance were starting to show in the midst of the server-side status quo. The Web 2.0 story was taking shape, and momentum really picked up with the release of jQuery in 2006.
The real thick client disruptor was still in the works, however. Apple shook up the mobile phone market as it entered the smartphone space in 2007 with the first iPhone. A year later, it gave its competitors a splash of blue ocean strategy with the introduction of the App Store. Native mobile apps would soon challenge the web as an application platform.
Back to the future - Adapting to the swinging pendulum
Recent years have been a celebration of distributed cloud computing and self-service virtualised infrastructure. Things have improved in terms of being able to use the right tool for each job at a granular level.
Team structures bottlenecked by centralised management of, for example, integration platforms are starting to change for the better with self-service platforms. However, the predominant microservices architectural style has also led to overcomplicated end-to-end solutions when microservices are built by default, without consideration for the full software lifecycle.
Perhaps we’re now at peak microservice, and the pendulum will start to swing toward centralisation again? There are certainly signs of this. The monolithic application architecture of years past has a new nickname, the “modular monolith”: it is the subject of seminal current works on architecture, and experienced architects and developers have in fact been using it all along while the hype was about microservices.
We’ll keep our eyes fixed on the pendulum as it swings back, and cherish all the experience we got from this swing to the distributed software maximum!
Nitor turns 15 in 2022. In this anniversary blog series, our experts look back at 2007, Nitor's founding year, and into the future trends in business, technology, design, culture, and agile methods.