”Technologies are useless if the construction industry doesn’t recognize the burden of its old ways of working and develop new practices.” (Taina Eriksson, DigiPro research, Confederation of Finnish Construction Industries RT)
For years now, the whole world has been making a fuss about the megatrend called digitalization. As professionals in the construction industry, we’re still waiting for salvation, while as consumers we get to enjoy the best fruits of digitalization: the superb digital services produced with the immense development resources of the world’s most valuable companies. One of the best-known examples of this is, of course, Uber, which spectacularly disrupted an entire traditional industry.
So what’s the difference between ”digitizing” and ”digitalizing”? We get a concrete answer by looking at what happened in Finland under the Uber threat. The taxi industry was forced to ”do something”, and the best it could do was to put its 20th-century practices into bits: the processes based on cab booking services and manual labor stayed, and a layer of bits was added on top. As a result, you get just one new customer channel, a mobile application that allows a consumer to order a cab. But what’s new about that? Not much, because the consumer has been able to order that cab for decades using other channels.
Digitizing basically means forcing automatic data processing onto the old, tired and ineffective practices of the 20th century. Digitalization, in turn, means re-invention: solving problems in a new way that genuinely transforms traditional value chains. The productivity of construction will not go up with a mere silver bullet named digitalization, but digitalization does play a relevant part in the holistic system of tomorrow, where the other parts are modular prefabrication and takt production.
From processing data to developing the process and how it’s led
We all know that the construction industry struggles with poor productivity development and with standardizing new ways of working. New development happens all the time, and in application development, for example, a whole lot is going on, but the core problem remains: the different parties of the building ecosystem, that is, main contractors, subcontractors, end clients, designers, project developers and so on, lack a common set of concepts, a common language. Only by developing this ”lingua franca” of construction can we move forward from partial optimization and the mess of dozens of isolated partial processes.
In other industries, this common language is already a reality in the form of shared industry standards. In construction, we still have some way to go, but luckily the journey has begun. Many industry parties are actively collaborating with the global standards organization GS1, and Fira Smart is no exception. We first need to agree on shared concepts and a shared language before we can start digitalizing cross-company processes and value chains, such as material logistics and the related delivery chain management, to mention just a few examples.
Ari Viitanen, chairman of the board of Carinafour Oy, recently summarized the problems in the digitalization of the construction industry quite well: ”The biggest problem in making use of digitalization in the construction industry is the lack of standardized processes. There’s no use in digitalizing chaos.” I couldn’t agree more. Another ”industry”, namely data utilization and analytics, suffers from a similar intrinsic problem: 99% of the talking and doing seems to revolve around the last mile, the ”data utilization”. The more important phase, the one where the data is created, receives little attention. Yet it is exactly there, at the moment the data is created, in the daily business operations, that data quality is formed. The data leaves the ”factory” either directly in a suitable shape or it doesn’t. It’s a choice. And this choice is made by the director who sits on top of the business in question. Who else?
The principles of quality management say that the repair costs caused by quality defects grow exponentially the further you go along the delivery chain from the ”factory” towards the client. At the factory, the repair cost is 1x; in the next phase it is 10x, then 100x, then 1000x, and after that who knows how high. I don’t believe anyone gets applause, for example, when a car manufacturer must recall tens of thousands of cars from customers to be fixed at the factory.
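The cost escalation described above can be written out as a tenfold multiplier per stage. The following is a minimal sketch only; the stage names and the base cost of 1x are illustrative assumptions, not figures from any real delivery chain.

```python
# Illustrative sketch of the "rule of ten" in quality management:
# a defect that costs 1x to fix at the "factory" costs roughly ten
# times more at each later stage of the delivery chain. Stage names
# and the base cost are hypothetical examples.

BASE_COST = 1.0  # relative repair cost at the "factory"

STAGES = ["factory", "assembly", "delivery", "customer"]

def repair_cost(stage_index: int, base: float = BASE_COST) -> float:
    """Relative repair cost after `stage_index` steps down the chain."""
    return base * 10 ** stage_index

for i, stage in enumerate(STAGES):
    print(f"{stage:>10}: {repair_cost(i):>7.0f}x")
```

Running the loop prints the familiar 1x, 10x, 100x, 1000x ladder, which is exactly the point of the car-recall example: the later the fix, the more expensive it becomes.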
So my question is: why on earth do we do the opposite with data, the most important intangible asset of the 21st century? We have highly paid experts ”process” and ”fix up” the data right before it’s used, for example when it’s shown on dashboards or utilized in an AI solution. That is exactly where the cost of the quality repair is at its highest! Every organization that does this should ask its CEO how he or she feels about the practice. What do you think the answer will be: do you want the costs at 1x or at 10 000x? Exactly.
”Processing data” is not a phase that produces value; quite the contrary, it is a cost. Therefore, we should aim to process data (= cleaning, polishing, cutting & pasting) as little as possible. We should put the data in shape already at the ”factory”, just as the principles of quality management taught at universities say; it should happen in the everyday business operations. The artificial ”improvement phases” that take place after the data is created represent 100% waste, work that produces no value. This waste should be eliminated, and definitely not increased. How do we do that? By actively influencing the moment in time where the data is created. It is influenced by something called ”management”, and this management takes place in the daily operations, ultimately under the responsibility of the business directors or process owners.
From looking for data to looking at data
After fixing the data production process, it makes sense to start talking about fully utilizing the data. Then we don’t have to fight with the users over data quality issues; instead, quality is handled through proper daily management. For this very reason, we have worked on developing construction processes for years already. Today, we make use of a number of digital tools at construction sites, including tools for schedule and quality management and for external condition management. One great example is the concrete case of digital daily work at Postipuisto.
Once the daily operations produce high-quality, up-to-date and, most of all, reliable data, it’s easy to build digital solutions on top of it, such as site-office dashboards, to support data-driven management. The value proposition of such solutions came literally from the client, as a responsible site manager put it perfectly: ”From looking for data to looking at data”.
But we can only reach this when daily management is done well and the operations that result from it produce high-quality and reliable data. Only then does it make sense to harness digitalization and other tools to serve and shape this most important intangible asset of the century, to match the demand.
About the author
Tommi Vihervaara (Master of Science, Information Technology) is a passionate professional in data delivery chain management and digital solution development. His business-oriented and solution-seeking way of working has taken his career from software development through data analytics consulting to data management, for example in the aviation industry at Finavia. At the moment, he is focusing on advancing the systemic change of the construction industry through data at the modern construction company Fira. In his free time, he co-hosts the Datapääoma podcast and writes a blog under the same brand in Finnish.