Smart Construction Sites with Connected Sensors

Gideon Farrell
Published in Converge
7 min read · Apr 17, 2015


This is from a talk I gave at the Big Data in AEC conference in London yesterday.

In the last few years, there has been a huge amount of interest in smart things. Everything must be smart. We have smart homes with smart thermostats, smart lights and smart fridges. We have smart buildings with smart management systems and we have smart cities with smart everything inside them.

So “smart” has become a bit of a buzzword, but there’s a point behind it: for something to be smart, we have to give it the capability to act independently.

All these smart things should be capable of knowing their internal state and reacting accordingly.

Governments all over the world are becoming interested in smart cities. Good examples include the Songdo International Business District in South Korea, Wood Wharf here in London and the Delhi-Mumbai Industrial Corridor in India. These smart cities will monitor things like transport status, the flow of citizens through their arteries, air quality and so on. This allows local authorities to better react to abnormal phenomena (a spike in traffic, perhaps, or a transport outage), but also to better plan around how citizens normally interact with and use the urban environment.

Common to the smart home, the smart building and the smart city is that data are collected on the internal and external environment and automatically translated into intelligent action to optimise things like energy usage, people flow or air quality.

Also common to all these things is construction: smart buildings have to be built. There is a lot of talk about how to build smart systems into the finished product, but not as much about how to make the building process itself smarter.

Old industry, new tricks

For a site to be “smart”, we have to start with it knowing what is going on inside it.

Traditionally, this is done manually. I’ll give you an example: at Converge, we met with section engineers who monitor the curing progress of in-situ concrete pours. The most basic way of doing this, obviously, is just to wait for a sufficiently long time that the concrete simply has to be at full strength. This is the zero-knowledge approach, and you’ll find out if you’re wrong when it crumbles.

Improving on this, we can simultaneously pour cubes of concrete, and send them off to a lab to be crushed at set intervals, giving us a reasonable estimate of the strength of our component.

Most sites still use one of these two methods. The section engineers I mentioned do it in a more advanced way: they stick a thermocouple in the block and come by every day to take a temperature reading. They write that on a clipboard, run it to the site office, put it in a spreadsheet and then do some maths to turn it into a measure of strength.
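For the curious, that spreadsheet maths is typically a maturity calculation: accumulated temperature over time, mapped onto strength via a calibration curve. Here’s a minimal sketch in Python, assuming the common Nurse-Saul formulation with placeholder calibration constants; real values come from lab tests of the specific mix.

```python
import math

# A minimal sketch of the Nurse-Saul maturity method, one common way to turn
# periodic thermocouple readings into a strength estimate. The datum
# temperature and calibration constants below are illustrative placeholders.

DATUM_TEMP_C = -10.0  # temperature below which hydration is assumed to stop

def maturity_index(readings):
    """Nurse-Saul maturity (degC-hours) from (hours_elapsed, temp_C) samples."""
    maturity = 0.0
    for (t0, temp0), (t1, temp1) in zip(readings, readings[1:]):
        avg_temp = (temp0 + temp1) / 2.0
        maturity += max(avg_temp - DATUM_TEMP_C, 0.0) * (t1 - t0)
    return maturity

def estimated_strength_mpa(maturity, a=-25.0, b=8.0):
    """Illustrative logarithmic strength-maturity relation: S = a + b*ln(M)."""
    if maturity <= 0:
        return 0.0
    return max(a + b * math.log(maturity), 0.0)

# One reading per day, as the section engineers described:
daily = [(0, 15.0), (24, 28.0), (48, 24.0), (72, 20.0)]
m = maturity_index(daily)
print(f"maturity = {m:.0f} degC-h, strength ~ {estimated_strength_mpa(m):.1f} MPa")
```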

In most cases, that’s as far as it goes, although there is some more sophisticated equipment out there, for example GSM-connected readers, or even wireless probes that send data back to your laptop.

I think we can all agree that the first two methods are pretty inefficient, and that even the third method seems a bit tedious (and it is; I asked). These last two though, they seem to have some good technology in them, right?

The problem with these is that they’re relying on a decades-old paradigm. All these systems report data to a single point, the engineer’s laptop, and it’s up to her to do something with those data. The process is still manual.

Modernisation

Two huge trends in technology have really exploded in the last decade: the Internet of Things (IoT, which has actually been around a lot longer as “M2M”) and Big Data.

Now, IoT has blown up in the consumer space. Every day there are new projects cropping up on Kickstarter: connected bikes, connected thermostats, connected door-locks, connected pet-monitors, connected dummies for babies (I kid you not) and my all-time favourite, the connected tweeting fridge.

Much of this may seem whimsical, but it has had a really important effect: dramatically driving down the cost of connected hardware and low-power computing devices. A big reason the industry hasn’t been instrumented sooner is that, without this downward pressure on hardware costs, it just wouldn’t have been commercially feasible.

Big Data and the idea of “cloud computing” have completely changed how we think about data storage and processing. Instead of the data being processed by an engineer on her laptop, they can be put in the cloud and analysed there more cheaply, and completely automatically.

This leads us to the somewhat oxymoronic idea of simultaneously centralising and decentralising our data storage and processing. We’re dismantling data silos, so data are stored “centrally” in the abstract notion of “the cloud”; but the system is decentralised in that scaling happens across multiple servers (and sometimes multiple data centres), and in that these data are accessible from anywhere. They are no longer localised.

These two trends mean that it is now really economical to instrument construction sites, because not only is the hardware for adding connectivity to sensors much cheaper than it ever has been, but centralising data storage and processing removes the high overhead of setting up a local data facility and also makes the results accessible to those who need them.

In practice, taking the example I gave of monitoring concrete, this means we can have continuous, automated, real-time monitoring of the curing progress of concrete on-site.

Introducing: Converge

This is where Converge comes in. We automate sensor monitoring for industry and we’ve started with construction. We’re currently in trials with Laing O’Rourke to instrument concrete pours with sensors, using our platform to extract the data and analyse them.

So let’s take concrete, looking only at the scale of one block. Previously, we could only know the progress of this block at set time intervals, and the data resided on a single machine. With all this new technology, we can know immediately when the block has reached the required strength, and anyone concerned can know, not just the engineer who has the numbers on her laptop.

Converge has three parts: we attach a small device (we call it a “node”) to each sensor, which extracts the data and transmits them over a low-frequency mesh network to a Hub. The Hub then sends the data to our servers over a 3G, WiFi or Ethernet connection (depending on what’s available on the site), where our Platform stores the data, analyses them and provides the tools to manipulate and visualise them easily.
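To make the data flow concrete, here’s a simplified sketch of the Hub’s forwarding role. The endpoint, payload fields and batching policy are illustrative assumptions rather than a description of our production protocol:

```python
import json
import time
import urllib.request

# Readings arrive from nodes over the mesh, are batched, and are uploaded
# to the platform over whatever backhaul is available on site.

UPLOAD_URL = "https://platform.example/api/readings"  # hypothetical endpoint
BATCH_SIZE = 20

pending = []

def on_mesh_reading(node_id, sensor_type, value):
    """Called whenever a node delivers a reading over the mesh network."""
    pending.append({
        "node": node_id,
        "sensor": sensor_type,   # e.g. "thermocouple"
        "value": value,          # e.g. temperature in degC
        "timestamp": time.time(),
    })
    if len(pending) >= BATCH_SIZE:
        flush()

def flush():
    """Push the current batch upstream over 3G, WiFi or Ethernet."""
    body = json.dumps({"readings": pending}).encode("utf-8")
    req = urllib.request.Request(
        UPLOAD_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)  # in practice: retry and buffer on failure
    pending.clear()
```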

We actually take this a step further and add active notifications, texting or emailing the relevant parties when a block has reached the requisite strength. The main reason is that engineers shouldn’t have to interrupt their workflow to check yet another dashboard; rather, they should be notified when the data become relevant.
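The logic itself is nothing more than a threshold check with de-duplication. Here’s a sketch with an illustrative target strength and a stand-in notify() function; real delivery would go through an SMS or email service:

```python
# Each new strength estimate is checked against the target, and the relevant
# people are messaged exactly once when a block first crosses it.

REQUIRED_STRENGTH_MPA = 30.0  # illustrative target for striking formwork
already_notified = set()

def notify(recipients, message):
    print(f"-> {', '.join(recipients)}: {message}")  # stand-in for SMS/email

def on_strength_update(block_id, strength_mpa, recipients):
    """Fire a one-off alert the first time a block crosses its target."""
    if strength_mpa >= REQUIRED_STRENGTH_MPA and block_id not in already_notified:
        already_notified.add(block_id)
        notify(recipients, f"Block {block_id} has reached "
                           f"{strength_mpa:.1f} MPa and is good to go.")

on_strength_update("pour-42", 31.2, ["section.engineer@site.example"])
```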

So far so good, but that’s pretty Small Data. Look at an enormous site, say the nuclear reactor at Hinkley Point C, where every day of delay can cost in excess of £1m: knowing the exact state of your site at every moment in time can help you eliminate the majority of those delays. Not only that, but you improve safety, because you know when it isn’t safe to walk into a structure whose beams aren’t yet at full strength, for example. The site becomes a replete data set, logged centrally and accessible to all the right people.

A construction job is a process, and like all processes it has bottlenecks. Instrumenting a site with automated, continuously monitored, wireless sensors and processing the data these produce allows you to optimise that process, eliminating the majority of the bottlenecks.

These data don’t have to be passive, however. With the potential scale of data collection for a site, you can start running predictive models to optimise workforce deployment; you can use weather data to anticipate inclement conditions that will slow progress, and plan around them; and you can monitor the exact performance of your design to learn for future projects. In essence, by making sites “smart” using connected sensors, the process of construction becomes safer and more efficient, and the data from each site feed into the learning set for future projects.

To finish off, I’ll paint a picture of how we plan on implementing this on-site: on a large site with many simultaneous pours, Converge monitors the progress of each component as it strengthens. By having an overall picture of the progress of the job, workforce requirements can be predicted day-by-day based on forecasts of how much of the concrete will have set, also taking into account weather conditions because external temperature affects the rate at which concrete sets. This way, the only people on site will be the people who actually need to be there, and this can be planned far in advance. The result is a construction site which understands its internal state and external environment and reacts accordingly to optimise for efficiency and safety. Embedded sensors can then be monitored over the lifetime of the building to provide data on the design’s performance. We all know buildings are over-designed, but now we can know by exactly how much.
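As a toy illustration of how forecast temperatures could feed that kind of planning, reusing the simplified maturity model from the earlier sketch (the target and temperatures are illustrative numbers):

```python
# Project maturity forward with hourly forecast temperatures and report
# the hour at which the target should be reached.

DATUM_TEMP_C = -10.0
TARGET_MATURITY = 3000.0  # degC-hours equivalent to the required strength

def hours_until_target(current_maturity, hourly_forecast_temps_c):
    """Return the forecast hour at which the target maturity is reached."""
    maturity = current_maturity
    for hour, temp in enumerate(hourly_forecast_temps_c, start=1):
        maturity += max(temp - DATUM_TEMP_C, 0.0)  # one-hour increments
        if maturity >= TARGET_MATURITY:
            return hour
    return None  # not reached within the forecast window

cold_week = [5.0] * 168   # a cold week slows curing...
mild_week = [18.0] * 168  # ...a mild one speeds it up
print(hours_until_target(1000.0, cold_week))  # -> 134
print(hours_until_target(1000.0, mild_week))  # -> 72
```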

Converge is a generalised data platform, and not limited to concrete monitoring or even to construction. Almost all large industries can benefit from automated monitoring of sensors. This is where we have started, however, and we believe that the technology we’re building to automate sensor monitoring on-site can revolutionise how construction sites operate, cutting costs, eliminating delays and improving safety and quality control.

To find out more, drop us an email at hello@converge.io
