Beyond BIM: Data Driven Construction_Part 1 Principles
As my role in projects has evolved, so has my view of digitalization in project management. A fundamental part of my effort over the last two years has been managing the data generated by the project itself, for the benefit of that same project.
I already argued in my previous articles, Why BIM did not land? and If we don't call it BIM?, that the digitization of the industry, and of construction project management in particular, is much more than the geometric modeling of the project and its coordination. That is why I have decided to share a series of articles (let's see how many parts come out) explaining my vision, based on my experience.
Let's start by defining some basic concepts in this first part. As in all my articles, I do not intend to establish the truth, but to share the extent of my knowledge and experience and to open a window for debate and conversation with everyone who wants to take part.
To Begin: What is Data?
These days we are surrounded by data everywhere: at work, in our personal lives, on vacation. But how does this data work? In the end, everything comes down to ones and zeros. When we send a photo through a social network, listen to a song on a device, or do practically anything else in our daily routine, that information is transferred as bits. Our devices use this binary code as a language to create digital content. Why would the management of a project be any different?
Types of data
DIKW Model
This model shows an evolution in which "knowledge" is understood as a cognitive process that starts from a simple piece of data and passes through increasing degrees of complexity until it reaches wisdom. It is an ascending pyramid: from data we move to information, by managing that information we build knowledge, and by processing that knowledge we arrive at wisdom. For example, a temperature reading on site is data; "the slab cured at 18 °C on average" is information; "curing below 15 °C puts the schedule at risk" is knowledge; and deciding to protect the next pour against a cold snap is wisdom. In short, our ultimate goal is to take advantage of the data at our disposal for our own benefit, generating "wisdom" by managing and processing the same data in different scenarios over and over again.
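To make the ladder concrete, here is a minimal Python sketch of the same idea; the readings and the 15 °C threshold are invented for illustration, not taken from any real curing norm.

```python
# A minimal sketch of the DIKW ladder with hypothetical
# concrete-curing temperature readings (all values invented).

# Data: raw, context-free readings in degrees Celsius.
readings = [18.2, 19.1, 17.8, 16.5, 15.9, 14.7]

# Information: the data given context and structure.
average_temp = sum(readings) / len(readings)
print(f"Average curing temperature: {average_temp:.1f} C")

# Knowledge: information interpreted against a domain rule
# (the 15 C threshold is illustrative only).
CURING_THRESHOLD_C = 15.0
at_risk = min(readings) < CURING_THRESHOLD_C

# Wisdom: knowledge applied to a decision.
if at_risk:
    print("Low temperatures detected: protect the pour or extend curing.")
else:
    print("Curing conditions within the expected range.")
```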
What is BIG DATA?
Big Data is the term that describes a very large volume of data, growing exponentially over time: data sets so large and complex that no traditional data tool can store or process them efficiently. Where does all this data come from? From ourselves. We generate it, directly or indirectly, second after second, as we will see in an example below. The idea is to make this data searchable and feed it into machine learning systems, so that in the future we can rely on virtual assistants.
Big Data is often explained through the so-called 5 Vs. At the beginning there were only three, but the technology is advancing so fast that some now speak of as many as seven V-terms to define it. The main terms are as follows:
Volume: The volume refers to the amount of data generated and stored. The size of the data determines its potential value and insight, and whether it can be considered true big data. We are talking about so much data that it does not fit on a normal hard drive, even a really big one; a multitude of interconnected computers is needed, forming what is usually called a cluster.
Velocity: Velocity tells us about data that arrives non-stop. It can be tweets on a certain topic being recorded for later analysis, or a presence sensor that emits a signal every time someone enters an establishment.
Variety: Variety refers to the type and nature of the data, which helps people analyze it and use the results effectively. Big Data mixes structured, semi-structured and unstructured data: text, images, audio and video. Ordered pieces are also completed through data fusion (a small sketch after this list illustrates the three forms).
Value: Value is a measure that allows us to determine whether the data is important and worth the effort of working with it.
Veracity: Veracity refers to the degree to which the data is reliable, that is, verified.
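As an illustration of variety, here is a minimal sketch showing the same hypothetical site event in the three typical forms; all values are invented.

```python
# The same hypothetical site event in the three typical forms of "variety".

# Structured: fixed schema, ready for a relational table.
structured = ("2024-03-14 08:02:11", "gate-3", "entry")

# Semi-structured: self-describing, flexible schema (e.g. JSON from a sensor).
semi_structured = {
    "timestamp": "2024-03-14T08:02:11",
    "sensor": {"id": "gate-3", "type": "presence"},
    "event": "entry",
}

# Unstructured: free text (or images, audio, video) with no schema at all.
unstructured = "A worker entered through gate 3 just after eight this morning."
```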
How much data are we talking about?
According to a study by the company DOMO, each person generates around 2 MB of data per second. If we multiply that by 8 billion people, that is some 16 billion MB per second, roughly 16 petabytes every second.
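The back-of-the-envelope arithmetic, with the units made explicit (both inputs are rough estimates):

```python
# Rough check of the figure above (both inputs are estimates).
mb_per_person_per_second = 2           # ~2 MB/s per person (DOMO estimate, rounded)
world_population = 8_000_000_000       # ~8 billion people

total_mb_per_second = mb_per_person_per_second * world_population
print(f"{total_mb_per_second:,} MB/s")                    # 16,000,000,000 MB/s
print(f"{total_mb_per_second / 1_000_000_000:.0f} PB/s")  # ~16 petabytes per second
```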
What is Data Driven?
Taking into account what was explained in the previous section, it is safe to assume that today the vast majority of companies in the world, in all sectors, highly value data: not only the data generated by the company itself, but also any data that can add value to its business.
A Data Driven approach allows companies to examine and organize their data in order to better serve their customers and consumers. By using data to drive its actions, an organization can contextualize and personalize its message to its prospects and customers for a more customer-centric approach.
In a data-driven organization, the Data Driven approach stimulates deeper analysis of the data, which is incorporated into the decision-making process, influencing the direction the company takes and adding value and impact.
The success of decisions is based on the quality of information
It is important to bear in mind that the success of a decision depends on the quality of the information available at the moment of making it. This information normally comes from three sources: the theoretical know-how of the professionals involved, their professional experience, and analytical data. Traditionally, decision-making was based on reason and experience; analytical data was sometimes used, but there was little information and it was heavily biased. With Big Data we are able, first, to access a much larger amount of data and, second, to cross-reference data sets, which exponentially improves the quality of the resulting information.
Data Driven: Concept
Without a doubt, data is a key ingredient. Of course, it can't be just any data; it has to be the correct information. The data set must be relevant to the question at hand. It also has to be timely, accurate, clean and unbiased, and perhaps most importantly, it has to be reliable. This is a difficult task: data is always "dirtier" than you might imagine. There may be subtle hidden biases that can influence your conclusions, and data cleansing and filtering can be a difficult, expensive, and time-consuming operation. It is often said that companies spend 80% of their time collecting, cleaning, and preparing data, and only 20% building models, analyzing, visualizing, and drawing conclusions from that data. As in almost everything, what matters is quality, not quantity.
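A minimal sketch of what that 80% looks like in practice, using pandas; the file name and column names ("site_progress.csv", "date", "cost", "trade") are hypothetical.

```python
import pandas as pd

# Hypothetical project cost log; file and columns are illustrative.
df = pd.read_csv("site_progress.csv")

df = df.drop_duplicates()                                  # remove duplicated rows
df["date"] = pd.to_datetime(df["date"], errors="coerce")   # fix types
df["cost"] = pd.to_numeric(df["cost"], errors="coerce")
df = df.dropna(subset=["date", "cost"])                    # drop unusable rows
df["trade"] = df["trade"].str.strip().str.lower()          # normalize labels

# Only now does the remaining 20% start: analysis and visualization.
print(df.describe())
```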
Data Access
You need a culture of data sharing within the organization so that data can be brought together, and you need people with the right skills to use that data. That can mean the mechanics of filtering and aggregating data, for example through a query language or Excel macros, but it also means people designing and choosing appropriate metrics to extract and track.
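By way of illustration, here is that filtering and aggregating applied to the same hypothetical dataset, with the roughly equivalent SQL in a comment:

```python
import pandas as pd

# Filter and aggregate the hypothetical cost log into a metric.
# Roughly equivalent SQL:
#   SELECT trade, SUM(cost) FROM site_progress
#   WHERE date >= '2024-01-01' GROUP BY trade;
df = pd.read_csv("site_progress.csv", parse_dates=["date"])

recent = df[df["date"] >= "2024-01-01"]
cost_by_trade = recent.groupby("trade")["cost"].sum().sort_values(ascending=False)
print(cost_by_trade)
```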
So for an organization to be data driven, there have to be humans in the loop: talent asking the right questions of the data, collaborators with the skills to extract the right data and metrics, and people using that data to inform the next steps.
Generating reports and analysis alerts
Reports and alerts are necessary but not sufficient in a Data Driven approach. Still, we should not underestimate the importance of either activity. Reporting in particular is a very valuable component of a data-driven organization: it is not possible to make decisions without it. But it is not enough; many organizations focus on reporting and carry out little or no actual (objective) analysis.
Reports tell you what happened in the past and provide a baseline from which to observe changes and trends. That may be interesting, but being data driven is a cultural strategy: to look into the future, you have to engage in analysis. Dig in and find out why the numbers are changing and, where appropriate, make testable predictions or run experiments to collect more data that sheds light on the why.
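A minimal sketch of the difference: the report establishes the baseline, the alert flags the deviation, and the analysis still has to explain it. The series and the two-sigma threshold are invented for illustration.

```python
import pandas as pd

# Weekly cost metric; the last value is the current week (values invented).
weekly_cost = pd.Series([102, 98, 105, 99, 101, 143])

# The report gives us the historical baseline.
baseline = weekly_cost[:-1]
mean, std = baseline.mean(), baseline.std()

# The alert fires when the new value deviates too far from the baseline.
latest = weekly_cost.iloc[-1]
if abs(latest - mean) > 2 * std:
    print(f"ALERT: weekly cost {latest} vs baseline {mean:.0f} +/- {std:.1f}")
    # The report says *that* it changed; analysis must find out *why*.
```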
I will give two classic cases in data analysis as examples.
The first is the case of the Allied forces during WWII, which were losing many lives and planes to enemy fire, so they decided to reinforce the planes to prevent them from being shot down. After analyzing the impacts on the planes returning to base, they planned to reinforce the areas with the most hits, but the mathematician Abraham Wald pointed out that they were misreading the data, since they were only analyzing the survivors (survivorship bias).
Remembering that the anti-aircraft fire was not precise and the hits landed "randomly" across the fuselage, the data could now be interpreted as "the parts that are not damaged should be protected": if no planes returned with hits in those areas, the most likely explanation is that being hit there substantially reduced the chance of staying in the air.
The second is known as the Anscombe quartet; it shows how data sets with the same statistical properties can look completely different when plotted.
These four data sets are different, but it turns out that they have the same arithmetic mean and variance of the x and y values, the same correlation coefficient, and the same regression line, at least to two or three decimal places. They form the Anscombe quartet, named after F. J. Anscombe, a mathematical statistician who published them in 1973. They are often used to teach that, in addition to calculating the statistical properties of data, it is wise to visualize them.
In all cases the plots tell us something more about the data: the first set looks somewhat noisy but linearly related; the second shows a clear but markedly different, non-linear pattern; in the third and fourth, the patterns are clouded by outliers. Those outlying values can be errors, real data that is simply out of the ordinary, or even data artificially produced to make everything fit.
As in the previous case: do not blindly trust the data, nor the statistics you obtain from them. Also build a visualization to understand them, along with an analysis of their interpretation.
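The quartet ships as a sample dataset with seaborn, so the point is easy to reproduce: nearly identical summary statistics, visibly different plots.

```python
import seaborn as sns
import matplotlib.pyplot as plt

# Anscombe's quartet is included in seaborn's sample datasets.
df = sns.load_dataset("anscombe")

# Nearly identical summary statistics for all four datasets...
for name, group in df.groupby("dataset"):
    print(name,
          f"mean_x={group.x.mean():.2f}", f"var_x={group.x.var():.2f}",
          f"mean_y={group.y.mean():.2f}", f"var_y={group.y.var():.2f}",
          f"corr={group.x.corr(group.y):.3f}")

# ...but visibly different shapes once plotted, with the same regression line.
sns.lmplot(data=df, x="x", y="y", col="dataset", col_wrap=2, ci=None, height=3)
plt.show()
```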
Data Driven: To take into account
Working with data requires additional effort. As with the implementation of any other technology, there is a learning curve that the business must assume. Good data management requires preparatory work and an investment in additional resources.
It also requires implementing technology, training, resources and time; ultimately, money. Many companies are reluctant to take these leaps of faith when it comes to technological implementations, especially since in many cases the return on investment takes time to become visible.
Data Driven: Bad practices
There are several reasons why many companies, more than 70% according to some studies, consider that despite having implemented data analysis processes and technology, their return on investment is negative: although they see the potential of the technology, today they are not able to take advantage of their data analysis.
These are some of the reasons identified for why companies still fail to implement this new way of working, despite considering themselves Data Driven companies.
1. The wrong questions are asked
This is by far the most common cause of data project failure, and the reason is simple: organization leaders are not data experts. They do not speak the highly technical language of analytics, and they often do not know what data they need to answer their business questions. Data experts, in turn, are experts in data, not in the business. So it is the data team's job to make sure business stakeholders are asking the right questions. Requirements gathering is a vital part of any data and information project, helping to really define the scope of the project.
2. Lack of collaboration, siloed databases
There is a communication gap between data professionals and their business clients, leading to disconnection at every stage of a data project. While the first group is fluent in the language of analytics, the other is much more comfortable communicating in the language of business and project management. This can create challenges when translating analysis into actionable insights.
Additionally, many data teams use systems like Jira to manage parts of the insight process. Platforms like these often act as a barrier for business clients, who tend to be less "technical" than data teams. This reluctance to use such systems is why, according to some surveys, 90% of data teams end up gathering requirements through messy email chains. As a result, data creators and data consumers do not interact properly, leading to a disconnect. And what happens when the most important project stakeholder is not involved throughout the process? No valuable information is extracted and no positive business results follow. Data projects cannot succeed without stakeholders working closely together: everyone must work from a common platform, always knowing the status of the work and where and how to participate.
3. Data is not shared with the right actors
Another reason, another gap: this time between the technical understanding of time-strapped data professionals and that of domain leaders. Clear communication is vital to the success of any data analysis. Organizational leaders may not be as skilled as data teams at extracting the points that should influence their business decisions. Therefore, the data must be presented and explained in a way that is easily digestible to the target audience. Data visualizations are often a highly memorable way to communicate insights, ensure their actionability, and drive business urgency and action.
Only 14% of consumers of data, analytics and insights report that a business decision or action is always taken because of the information they receive. Without an outcome, data projects are nothing more than numbers. If information is not clearly communicated to the right stakeholders then, despite the hard work of data professionals, there is a high chance it will never be implemented and will be left behind and forgotten.
4. Limitations on interoperability between platforms
The importance of an organized, easily searchable repository for obtaining information and knowledge from organizational data cannot be overstated. The problem is that insights often do not fit into the places or tools we already have, because those tools were not designed for that purpose. This makes it difficult to find and reuse insights from previous data. If insights cannot be discovered, they are useless.
To avoid potential project failures, the organization's data must be highly discoverable, easily accessible, consumable and usable. Centralized management of the organization's knowledge, rather than a maze of system folders, helps increase efficiency and security, improves workflow and simplifies regulatory compliance, to name a few benefits.
5. There is no standardization
Data projects are challenging. Requirements gathering often involves a large volume of email between data professionals and their business clients, and the scope of stakeholder needs changes frequently throughout the development process.
These are common frustrations that, unfortunately, can contribute to the ultimate failure of the project. The good news is that they are easy to eliminate by implementing systems and standards that work for both data professionals and their clients. Having the right tools in place ensures that the project outcome meets the business client's needs, driving action and business value.
Data Driven: Benefits
Data drives an analytical approach to decision making
The correct analysis of the data reduces the risks in decision making
Data facilitates the management of complex decisions
Data generates greater certainty and confidence
Data helps faster learning
Data helps better decision making
Data helps visualize impacts on multiple factors simultaneously
Data helps convince in cases of uncertainty
Data helps knowledge management
Correct data analysis reduces false positives
To be continued... Part 2_Structure and Tools