The ground model approach does two things: it improves the technical deliverable by reducing uncertainty in the ground conditions, enabling risks to be managed (and creating opportunities for good decisions to be made), and it optimises the investigation work itself to avoid wasted cost.
“Let’s take an example,” says Rod Eddies, geophysics expert for Fugro, a leading geotechnical and survey services company, “let’s take a very simple example of a brownfield site, somewhere remote and a couple of hectares in size where the client wants to locate and investigate old mine shafts.
“You could start drilling on a grid to locate the shafts, or you could actually do an investigation that locates them beforehand by remote means or by a desk study using legacy data.
“That clearly has a technical and commercial outcome: reduced effort and an optimised investigation in terms of cost and time.”
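As a rough, back-of-the-envelope illustration of that commercial outcome, the sketch below compares blanket grid drilling with a targeted programme on a hypothetical two-hectare site. The grid spacing, unit rates and number of targets are all assumptions made for the example, not figures from Fugro or any real project.

```python
# Hypothetical cost comparison: blanket grid drilling vs. desk study plus
# geophysics followed by targeted drilling. All numbers are illustrative
# assumptions, not real rates.

SITE_AREA_M2 = 2 * 10_000          # ~2 hectares
GRID_SPACING_M = 10                # assumed grid spacing to catch a shaft
COST_PER_BOREHOLE = 2_000          # assumed all-in cost per exploratory hole
DESK_STUDY_AND_GEOPHYSICS = 15_000 # assumed lump sum for non-intrusive work
TARGETS_FOUND = 6                  # assumed anomalies needing confirmation

# Option A: drill on a regular grid across the whole site.
grid_holes = SITE_AREA_M2 // GRID_SPACING_M**2
cost_grid = grid_holes * COST_PER_BOREHOLE

# Option B: locate likely shafts first, then drill only at the targets.
cost_targeted = DESK_STUDY_AND_GEOPHYSICS + TARGETS_FOUND * COST_PER_BOREHOLE

print(f"Grid drilling:     {grid_holes} holes, ~£{cost_grid:,}")
print(f"Targeted approach: {TARGETS_FOUND} holes, ~£{cost_targeted:,}")
```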
He adds: “This approach isn’t restricted to mine shafts, of course, but applies to a spectrum of situations where potentially unforeseen ground conditions could present risk.” Early investigations can be as simple as legacy data reviews or desk studies, followed by a fatal-flaw analysis to see whether a site should be left alone, perhaps because of faulting or instability.
Moving on from this, engineers might obtain data from the site itself to find out about the layering, the stratigraphy, what the structural discontinuities are and so on. This allows for the creation of domains in the site, each being of a particular character. A borehole programme can then be set up to better sample these domains. Then as boreholes go down, the data is fed into the model, refining it, and eventually the engineers have sufficient detail to produce an informed design.
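A minimal sketch of that iterative loop is given below. The domain names, uncertainty scores, budget and reduction rule are all assumed for illustration; this is not Fugro's workflow, just the shape of the process described.

```python
# Illustrative sketch of an iterative ground model: each domain carries an
# uncertainty score, and every borehole drilled in a domain reduces it.
# Domains, scores and the reduction factor are assumptions for illustration.

domains = {"made_ground": 0.9, "river_terrace": 0.6, "chalk": 0.8}
TARGET_UNCERTAINTY = 0.3   # acceptable residual uncertainty for design
BOREHOLE_BUDGET = 10       # maximum holes the client will pay for
REDUCTION_PER_HOLE = 0.7   # assumed fractional reduction per borehole

boreholes_used = 0
while boreholes_used < BOREHOLE_BUDGET:
    # Target the domain we currently know least about.
    name, score = max(domains.items(), key=lambda kv: kv[1])
    if score <= TARGET_UNCERTAINTY:
        break  # model is refined enough everywhere to inform design
    domains[name] = score * REDUCTION_PER_HOLE  # feed new data into the model
    boreholes_used += 1
    print(f"Borehole {boreholes_used} in {name}: uncertainty now {domains[name]:.2f}")

print("Final model:", {k: round(v, 2) for k, v in domains.items()})
```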
“So for me the ground model is not a single concept, it describes an iterative process where its evolution depends on the ability of the client to budget and pay for further investigation, dependent on the changing needs of the project. It is something that evolves through the lifecycle of preconstruction, and even beyond,” says Eddies.
CONVINCING CLIENTS
A live topic at the company is persuading clients to adopt model-based investigations. Larger, more complex projects warrant it more than smaller, simpler ones, but there is a trend towards increasing complexity as construction becomes more sophisticated.
“We feel that there is often an advantage to the client adopting a ground model. But not every project is a nuclear power station, so one of the challenges is how to scale this approach to small and medium sized projects where appropriate.”
The client’s interest in the ground is risk, and uncertainty is risk. If there is no information, contractors price for that risk accordingly (or at least they should), and this potentially inflates the market price.
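As a simple illustration of how that pricing can work, the sketch below treats the contingency as an expected cost that shrinks once investigation narrows the odds. The probabilities and costs are hypothetical assumptions, not market figures.

```python
# Hypothetical risk-pricing sketch: with no ground information a contractor
# carries the full probability of hitting adverse ground; investigation
# narrows that probability and so shrinks the contingency priced into a bid.
# All probabilities and costs are assumed for illustration.

REMEDIATION_COST = 500_000   # assumed cost if adverse ground is encountered

p_adverse_no_info = 0.20     # assumed chance priced in with no investigation
p_adverse_with_gi = 0.05     # assumed residual chance after investigation
INVESTIGATION_COST = 40_000  # assumed cost of the ground investigation

premium_no_info = p_adverse_no_info * REMEDIATION_COST
premium_with_gi = p_adverse_with_gi * REMEDIATION_COST + INVESTIGATION_COST

print(f"Contingency with no information: ~£{premium_no_info:,.0f}")
print(f"Contingency after investigation: ~£{premium_with_gi:,.0f}")
```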
To the geotechnical experts, the starting point is ‘what does uncertainty in the subsurface contribute to the risk profile?’ In the chalk geology of Southeast England, a key uncertainty is the presence of solution features.
“That’s the interest to the geologists. But the risks associated with the solution features are the chance of collapse or other surface disturbance, and that is the part that is of interest to the client.
“In order to manage the risk, you have to step back and manage the subsurface uncertainty. It is subsurface uncertainty that needs to be addressed first. It’s the same as in, for example, faulting.
“A fault isn’t inherently a significant issue, but a fault that presents onerous geotechnical conditions, or one that is potentially capable of movement, presents risk. But you need to find out first, and characterise it.”
“The construction sector often has a fairly traditional approach to evaluating ground risk. Many site developments still go ahead without a desk study or geophysical investigation. It might be that, for a fraction of the cost, a lot of the first-order problems could be found well in advance.
“Foreseeable problems become foreseen and assumptions of ground uniformity are much less frequently adopted.”
EARLY INVOLVEMENT
The plea of all specialists is to be involved in projects at an earlier stage, and geotechnical engineers and geophysicists are no different. Early involvement pre-empts problems and provides solutions for a fraction of the cost of bringing fixes to the job later in the construction cycle.
Simon Brightwell, a business development executive at Fugro, adds: “In a perfect world it would always follow that collegiate, early involvement process.
“When people get around the table with cups of tea years before a piece of infrastructure is built they can informally look at the pros and cons of different options, but it isn’t always like that.
“Companies do still have to operate in both ways and sometimes business is much more contractual, with a more formal, tendered setup.”
FUTURE CHANGES
However, technology and thinking are moving on. Eddies reflects: “I think – I’m an optimist – we have maintained a campaign over the last few years focused on the benefits of reducing uncertainty in the early phases of projects, and I see that as an evolving process.
“It’s a hearts and minds campaign basically, convincing traditional areas of business to undertake investigation that is different to what they do now.
“But I think the other part of it is that there are certainly new technologies in the wings for, let’s say investigations for tunnels.”
“I guess what we might see is a better use of legacy data – but I think certainly we are going to see gradual improvements in near surface investigation tools.
“One example we are developing is a way of imaging stratigraphy and structure and deriving geotechnical properties in one pass, using and adapting technology that has come from the oil and gas exploration sector: broadband seismic multi-component technology.
“The big difference is the use of instrumentation that can record all parts of the wave field, so you can extract geotechnical properties and imaging at the same time. The benefit of that is that everything is geo-referenced in the same place, it shortens your field programme, reduces HSE exposure and reduces cost – and produces a better technical deliverable.”
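One standard relationship that such data can feed is the small-strain shear modulus, Gmax = ρVs², obtained from shear-wave velocity and density. The sketch below is a generic illustration of that calculation under assumed values, not a description of Fugro's processing chain.

```python
# Generic illustration: deriving a small-strain shear modulus from a
# shear-wave velocity, one of the geotechnical properties that can be
# extracted from multi-component seismic data. Input values are assumed.

def shear_modulus_pa(density_kg_m3: float, vs_m_s: float) -> float:
    """Small-strain shear modulus G_max = rho * Vs^2 (in pascals)."""
    return density_kg_m3 * vs_m_s ** 2

# Assumed example values for a stiff near-surface soil layer.
density = 1_900.0   # kg/m^3
vs = 250.0          # m/s shear-wave velocity

g_max = shear_modulus_pa(density, vs)
print(f"G_max is roughly {g_max / 1e6:.0f} MPa")   # ~119 MPa
```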
Near surface
“From my perspective, in the geophysical group, a lot of the discussion we have revolves around how to improve near-surface information. It is a lot harder to characterise the top 50 m or 100 m than the next 3 or 4 km.”
“This is because the near surface is highly variable: it’s where the surface meets the ground, where water is, where weathering happens, where chemical reactions take place and where anthropogenic processes occur. It’s a dynamic and especially variable environment.
“If we could strip away the top 100 m and then do our geophysics, we’d be a lot happier. But that’s also the depth within which tunnels are built, foundations and piles are put down, and so on. It is our reality.
“A couple of guys in the States predicted there would be more spent on characterising the top 100 m than on oil and gas exploration in the near future. It might well come true before too long.
“One of the holy grails is bringing the oil exploration technology, know-how and experience into the near surface work environment.”
Visualisation
The industry is also moving towards visualisation and BIM in a big way. It is the information age, and with the wealth of data that can now be collected, users further along the construction cycle can be helped with newer ways of presenting it.
It has been a complaint in the past that information provided to the construction tiers is not always adapted as well as it could be for their purposes.
In geotechnical terms, geophysical data, airborne surveys and historical documentation can all be collated into one large dataset and presented in 2D (and now 3D) formats.
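As a minimal illustration of that kind of collation, the sketch below joins three georeferenced layers on shared plan coordinates using pandas (a common choice, not necessarily what any given project uses); the layer names, columns and values are assumptions for the example.

```python
# Illustrative collation of georeferenced site layers into one dataset.
# Column names and values are assumptions for the example.
import pandas as pd

geophysics = pd.DataFrame(
    {"x": [0, 10], "y": [0, 10], "resistivity_ohm_m": [120.0, 45.0]}
)
boreholes = pd.DataFrame(
    {"x": [0, 10], "y": [0, 10], "top_of_chalk_m": [6.2, 4.8]}
)
historical = pd.DataFrame(
    {"x": [0, 10], "y": [0, 10], "former_use": ["gas works", "open field"]}
)

# Join everything on shared plan coordinates to build a single site dataset.
site_model = geophysics.merge(boreholes, on=["x", "y"]).merge(
    historical, on=["x", "y"]
)
print(site_model)
```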
This sector still has some way to go to catch up with other industries, but things are heading in the right direction, with more thought going into how data is packaged for downstream use.
Eddies adds: “A particular challenge with geophysics is taking data that isn’t formally depth-referenced and making it depth-referenceable with confidence, then presenting it in 3D.
“In the exploration sector, no one will put an eight-figure borehole down in the Gulf of Mexico without a full 3D picture of the ground down to several kilometres. But unfortunately the near surface environment is more complex and engineering projects work on a different cost base.
“We’re working on it.”
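A minimal sketch of the depth-referencing idea Eddies describes is shown below: a two-way travel time picked from reflection data is converted to depth through a simple layered velocity model. The velocities and thicknesses are assumed for illustration, and a real conversion carries exactly the uncertainty he alludes to.

```python
# Minimal illustration of depth conversion: turning a two-way travel time
# from seismic reflection data into a depth estimate using a simple layered
# velocity model. Velocities and thicknesses are assumed for illustration.

# (interval velocity m/s, layer thickness m) from the surface downwards
velocity_model = [(400.0, 5.0), (1_600.0, 20.0), (2_300.0, 75.0)]

def depth_from_twt(twt_s: float) -> float:
    """Convert a two-way travel time (seconds) to depth (metres)."""
    one_way = twt_s / 2.0
    depth = 0.0
    for v, thickness in velocity_model:
        layer_time = thickness / v          # one-way time through this layer
        if one_way <= layer_time:
            return depth + one_way * v      # reflector lies within this layer
        one_way -= layer_time
        depth += thickness
    return depth + one_way * velocity_model[-1][0]  # below the model base

print(f"Depth for 40 ms TWT: {depth_from_twt(0.040):.1f} m")
```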
FINAL THOUGHTS
Brightwell concludes by turning to the later stages of the construction cycle: “Assessment and analysis of the ground should not be confined to pre-construction, as has been traditional. One of the key developments in modern engineering is the idea of intelligent structures.
“I think monitoring and understanding the condition and behaviour of your structure, the ground around it, and possibly its interaction with other things that may, for example, be built above it or in the vicinity of it, will become increasingly important.”
“This is something that starts at pre-construction, but if you are installing systems, why not install them to help you build the asset and also to look after it through its lifecycle? So I think there’s a key thing here: increasingly, structures will be intelligent. The sensible strategy would be to integrate the way we look at their behaviour, movement and other parameters through the lifecycle, as opposed to separating monitoring regimes before and after construction.”
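As a minimal sketch of the kind of through-life monitoring Brightwell describes, the example below checks the same settlement readings against different trigger levels for construction and operation. The sensor names, readings and thresholds are assumptions made for illustration.

```python
# Illustrative lifecycle monitoring check: the same settlement readings are
# assessed against different thresholds during construction and operation.
# Sensor IDs, readings and thresholds are assumptions for the example.

readings_mm = {"SM-01": 4.2, "SM-02": 11.8, "SM-03": 7.5}  # settlement, mm

THRESHOLDS_MM = {
    "construction": 15.0,  # assumed trigger level while building the asset
    "operation": 8.0,      # assumed tighter trigger level once in service
}

def check(phase: str) -> list[str]:
    """Return sensors exceeding the trigger level for the given phase."""
    limit = THRESHOLDS_MM[phase]
    return [sensor for sensor, value in readings_mm.items() if value > limit]

for phase in ("construction", "operation"):
    print(f"{phase}: exceedances -> {check(phase) or 'none'}")
```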
The thinking around the ground model approach is evolving, with ideas from inside and outside the industry driving it forward.