This article was cowritten by Lucas Marshall and Mike Anderson of Milwaukee Tool. The opinions expressed herein are the authors’ own and do not necessarily reflect the views of the company.
Generative artificial intelligence (AI) has ruled the headlines since ChatGPT’s unveiling in November 2022 and the launch of rivals like Google’s Bard soon after.
In the intervening months, AI applications—both good and bad—have rippled across everything from photo editing to cybersecurity. How might generative AI, then, fit into a construction workflow? That’s what we seek to answer in this article. We’ll explore the possibilities for supercharging an industry that has been slow to adopt technology. We’ll also uncover some of the possible threats that business owners should be cognizant of before adopting the technology in full force.
What is Generative Artificial Intelligence?
“Artificial intelligence” has taken on a broad and much debated range of meanings since the term was coined by computer scientist John McCarthy in 1956. A decent working, if somewhat ambiguous, definition for AI is any computer system that simulates an aspect of human cognition while performing a variety of tasks with a high degree of autonomy. Some forms of AI remain theoretical—particularly Strong AIs like Artificial General Intelligence (AGI) and Artificial Super Intelligence (ASI)—while others exist today and are available for virtually anyone to use online.
The type of AI you’re most likely familiar with is called Generative AI—deep-learning computer models that leverage probability and vast amounts of training data to respond to prompts from human users. Often taking the form of conversational chatbots, generative AI platforms can conjure entire blocks of text, computer code, realistic images, videos, music, voices, and more. In the AEC world, some architects use generative AI programs to rapidly create multitudes of digital building models in a process known as generative design.
Generative AI has existed since as early as the 1960s, but it has experienced two major evolutionary leaps within the last two decades. The first, according to IBM, came in the late 2010s with the advent of neural networks capable of generating aged portraits and transforming photos into works of art. The world is still reeling in the aftershock of the second leap, which occurred when the now-notorious chatbot ChatGPT, built on a large language model, exploded onto the scene in November 2022.
The arrival of powerful generative AI tools like ChatGPT marks the dawn of a new and uncertain era. How will we use artificial intelligence? In what ways will it shape us, the world, and the construction industry more specifically? To begin to answer these questions, we must be as clear-eyed about the dangers of AI as we are excited about the opportunities it presents.
The Dangers of AI
Urging caution around AI isn’t the sole purview of skeptics and Luddites.
Many of the creators of modern AI—including OpenAI CEO Sam Altman, ex-Google CEO Eric Schmidt, and “AI Godfather” Geoffrey Hinton—have expressed fears that AI technology could ultimately be harmful to humanity. More than 1,000 researchers and technology leaders have called for a six-month pause in the development of the most powerful AI systems, writing in a shared open letter that the technology poses “profound risks to society and humanity.”
The threats posed by AI have rapidly accelerated from the potential and existential to the actual and real. While some scenarios remain within the realm of speculative science-fiction (for now), there are plenty of examples of serious real-world harms that have already been caused by artificial intelligence.
Disinformation Machines
Artificial intelligence has proven itself to be a powerful tool for creating and spreading disinformation. By typing a few simple commands into a prompt box, users of AI tools can deepfake everything from images, text, and video to voices of celebrities and loved ones with eerie fidelity and precision. In a politically divided society riven with culture wars and conspiracy theories, the threat of mass-deception posed by existing generative AI tools is too big to ignore.
Examples of AI misinformation abound: AI content farms are automating the mass-production of fake news online, counterfeiting everything from social media posts to video news broadcasts. Scammers have successfully deployed AI to defraud investors. In California, the Department of Financial Protection and Innovation warns of one shady crypto securities firm that has even used AI to fabricate an avatar of its fictional CEO. AI voice generators are also being weaponized in a rash of scams to trick people into believing they’ve received calls from distressed loved ones who say they’re being held for ransom. Only it’s not their loved ones on the other end—it’s an AI-generated clone.
Even when functioning as designed, top-tier generative AI platforms produce egregiously false information, otherwise referred to as “hallucinations.” In an early example, the ChatGPT-powered Bing chatbot fabricated financial figures out of thin air during Microsoft’s February 2023 launch demo of the new Bing search engine. Meanwhile, Google parent company Alphabet lost $100 billion in stock value after its own chatbot, Bard, incorrectly claimed in a promotional demo that the James Webb Space Telescope took the first photos of a planet outside our solar system.
What this means for construction: Overreliance on AI programs for data management and analysis could introduce vulnerabilities and disruptions into construction workflows and processes. Perhaps more important is the corrosive impact that AI could have on business relationships. In the wrong hands, AI can be used to falsify information and create fake personas for the purpose of defrauding construction companies and their clients. To combat this, AEC (Architecture, Engineering and Construction) professionals should retain critical human oversight to fact-check information delivered by AI platforms, follow cybersecurity best practices, and keep current on their AI literacy.
The Danger of Bodily Harm
AI need not take the liquid metal form of a T-1000 to cause serious damage in the physical world. Existing AI is already more than capable of stretching beyond its digitized borders to hurt and even kill people, especially when left unsupervised by a human user.
One of the most prominent examples is the catastrophic failure of an automated flight control system that caused the fatal crashes of two Boeing 737 MAX jetliners in October 2018 and March 2019, killing 346 people. Cars using self-driving and driver-assistance systems have also been involved in hundreds of crashes per year, some of them fatal—including the killing of a pedestrian in 2018 by a self-driving Uber test vehicle. In another case, a Belgian man recently died by suicide after allegedly being encouraged to do so by an AI chatbot.
What this means for construction: Given the extremely high level of hazard in construction, AEC professionals should never, under any circumstances, hand over complete instrumental and decision-making power to an AI-guided machine. Automated construction equipment and robots can be helpful aides on the jobsite, but only if safeguards and human oversight are in place to protect workers from harm caused by possible malfunctions or a flaw in the machine learning process.
AI Bias
Speaking of flaws, AI can cause harm in a less obvious way by making racially and socially biased decisions that have enormous negative impacts in the real world. Increasingly, algorithms are enlisted to decide everything from who gets hired (or fired) to who gets a longer prison sentence. The false assumption underlying AI is that its decisions are inherently objective and fair, free from the prejudices and clouded judgments of a human mind. The truth, however, is that AI programs are built by humans, and therefore often end up reproducing the harmful biases of the individuals and systems that design them.
The object-detection systems used in self-driving cars, for instance, are less accurate at recognizing pedestrians with darker skin, according to researchers at the Georgia Institute of Technology. Another eyebrow-raising example is a 2016 investigation by ProPublica, which revealed that a criminal justice algorithm used in Florida disproportionately mislabeled African American defendants as “high risk” for recidivism, even when their offenses were minor compared with those of white defendants labeled “low risk.”
What this means for construction: Whether AI is involved or not, the construction industry must always endeavor to instill fairness and equity into its decision-making processes. Not only is it the right thing to do; it’s good for business.
Replacement by AI
Another major threat is that AI will automate people’s jobs. Indeed, one of the central issues fueling the ongoing strike by the Writers Guild of America is the fear that studios will replace human writers with AI programs for writing film and television scripts. These fears aren’t unfounded: A recent Goldman Sachs study estimates that AI could expose the equivalent of 300 million full-time jobs to automation in the coming years. Even Silicon Valley, once thought to be an impregnable bastion of job security, is grappling with anxiety that AI will soon replace many positions within the tech industry.
What this means for construction: The danger of replacement as technology advances remains a perennial concern for all occupations. That said, the complete automation of heavy industry has yet to come to pass. While you’re more likely to find a construction robot on a jobsite today than you were 20 years ago, these machines function better as helpers and are (for now) incapable of fully replacing the human construction workforce. This isn’t to say that AEC professionals should let their guard down: the skilled labor shortage is ongoing and sufficiently advanced machines may someday prove an enticing alternative to some short-sighted construction firms. In the immediate term, however, the danger of AI replacing the human construction workforce is low.
AI Legal Troubles
A final danger that ought to give industry leaders pause is that many current generative AI platforms stand on shaky legal ground.
A wave of lawsuits against AI companies has already emerged, and more may be on the way. Among the plaintiffs are heavy hitters like Getty Images, which alleges in a lawsuit against Stability AI, the maker of the image generator Stable Diffusion, that the company copied more than 12 million of its photographs without permission. Meanwhile, a group of artists has filed a class-action lawsuit against several AI art generators, alleging that the apps are using the work of millions of artists without their consent.
What this means for construction: The consequences of these lawsuits remain to be seen. Nevertheless, construction firms should exercise caution over how and when they deploy AI, as their choices may open them up to legal action in the future.
Lack of AI Ethics and Regulations
The many risks associated with AI cry out for regulation. Yet as of this writing, there are virtually no guardrails in place to protect people from the runaway development and deployment of AI technology. To add to this troubling reality, many of the attempts to introduce ethical safeguards have been quashed.
A prominent example is Timnit Gebru, the former AI ethicist for Google who was ousted from the company in 2020 after co-authoring a paper warning of the racial and socio-political biases encoded into existing AI programs. This event foreshadowed Microsoft’s decision to lay off its entire AI ethics and society team in March 2023.
And yet regulation has the support of OpenAI CEO Sam Altman, who recently testified before Congress about the need for stronger rules governing the proper use of artificial intelligence.
Generative AI and the Construction Industry: Which Workflows Win?
While text-based generative AI is discussed mostly in creative fields like entertainment and marketing, applications in the AEC industry are slowly emerging, just as visual generative design has already been adopted by architectural SaaS heavy-hitters like Autodesk.
Whether you’ve already been using ChatGPT and services like it or are still on the fence about them, the floodgates have opened.
Experts agree that a positive way forward is finding legitimate ways to use the technology to augment, not replace, human workers, while the industry and society at large acknowledge the associated risks and find meaningful ways to address them.
The Construction Supply Chain
Writing for Forbes, Russell Haworth believes Natural Language Processing (NLP) models may be “fine-tuned […to] slash research time” for architectural designers and to empower supply chain and inventory managers, potentially “suggesting the best product matches and even answering questions relating to product performance.”
In these hands, Haworth believes, the technology can serve as a “decision-making tool” to curb supply chain issues and stay competitive while other construction companies that ignore its potential stumble. The technology, he adds, is capable of alerting “designers to a product’s feature or availability, ensuring that they are up to date with the latest product information.”
Considering that costs for materials like prefabricated steel and lumber have recently surged by 45% and 30%, respectively, the dynamic use of NLP in this capacity holds huge potential in an industry known for profit margins below 3%. What’s more, Haworth believes that, “Deployed correctly, [the technology] could deliver improvements to energy usage, carbon calculations, and on-site safety and security.”
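To make the product-matching idea concrete, here is a minimal sketch in Python. It is not Haworth’s system or any vendor’s actual product: a production tool would more likely rely on a fine-tuned language model or learned embeddings, while this toy version simply ranks an invented catalog against a free-text requirement using TF-IDF similarity from scikit-learn.

```python
# Toy "best product match" lookup: rank catalog entries against a free-text
# requirement using TF-IDF cosine similarity. Product names and descriptions
# are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

catalog = {
    "FastSet Wedge Anchor 3/8 in.": "wedge anchor for cracked concrete, interior or exterior use, seismic rated",
    "ProBeam LVL 1.75 x 11.875": "laminated veneer lumber beam for long spans with a high load rating",
    "EcoDeck Composite Board": "composite decking board, UV resistant, 25-year limited warranty",
}

def best_matches(query: str, top_k: int = 2):
    names = list(catalog)
    docs = [f"{name} {desc}" for name, desc in catalog.items()]
    vectorizer = TfidfVectorizer().fit(docs + [query])
    scores = cosine_similarity(vectorizer.transform([query]), vectorizer.transform(docs))[0]
    return sorted(zip(names, scores), key=lambda pair: pair[1], reverse=True)[:top_k]

if __name__ == "__main__":
    for name, score in best_matches("anchor rated for exterior concrete"):
        print(f"{name}: {score:.2f}")
```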
Robert LaCosse, a Senior UX Researcher for Milwaukee Tool, agrees that natural language processing has a part to play in construction cost management, helping to “optimize for material purchase and use.”
With construction’s major challenges centering on materials and labor utilization, LaCosse believes AI can help businesses optimize their operations by reducing existing waste and maximizing efficiency.
Explaining the potential power of generative AI, he highlights chatbots. “A chatbot that has a complete history and knowledge of every conversation with that Superintendent,” he says. “It potentially has access to the data that describes that construction site, so it already has an awareness of what might be needed.”
“Over time,” he adds, the chatbot can pull from “archival or historic data that would already inform and potentially lend a predictive element to phone orders for resupplies that are made throughout the history of a project based on prior project models.”
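As a rough illustration of the kind of assistant LaCosse describes, the sketch below folds an invented order history into a chat prompt so a model can anticipate resupply needs. It is not a Milwaukee Tool product or his actual design; the order log, project details, model choice, and use of the OpenAI Python SDK are all assumptions, and any chat-capable model client would work the same way.

```python
# Minimal sketch: fold a project's historical resupply orders into the system
# prompt so the model can suggest likely reorders. The order log and project
# details are invented for illustration.
from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

order_history = [
    {"week": 3, "item": "2x4 lumber", "qty": 120},
    {"week": 5, "item": "2x4 lumber", "qty": 80},
    {"week": 5, "item": "drywall screws (lb)", "qty": 25},
    {"week": 8, "item": "drywall sheets", "qty": 60},
]

def build_messages(question: str) -> list:
    history = "\n".join(f"week {o['week']}: {o['qty']} x {o['item']}" for o in order_history)
    system = (
        "You are a jobsite resupply assistant. Use the order history below to "
        "anticipate what the superintendent is likely to need next.\n"
        f"Order history:\n{history}"
    )
    return [{"role": "system", "content": system}, {"role": "user", "content": question}]

if __name__ == "__main__":
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=build_messages("We start hanging drywall on the second floor next week. What should I reorder?"),
    )
    print(response.choices[0].message.content)
```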
He also believes that AI has a role to play in project management.
Project Management
Downtime, LaCosse explains, can be as high as 30% on some jobs simply due to less-experienced staff waiting around for orders. Traditionally speaking, he elaborates, “We rely upon a superintendent or a foreman or journeyman in any senior position to essentially tell junior positions where they need to be, what they need to be doing.”
“This is another area where I think an AI assistant may be useful,” he explains. “I don’t think that the role of an AI is to be a director on a jobsite. But I do think it has a significant role to play in advising, advising with next level strategic thinking.”
LaCosse believes AI tools can help project managers with tasks like:
- Calendar layout
- Project scheduling
- Path of execution
Using AI in project management could be game-changing. “Because for every hour of downtime removed, you are pushing that profit margin up and ultimately facilitating the growth and optimization of the company doing the work,” he explains.
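The “path of execution” item is, at bottom, the classic critical-path calculation familiar from project scheduling, and it is exactly the kind of bookkeeping an AI assistant could surface on demand. Below is a minimal sketch of that calculation in plain Python; the tasks, dependencies, and durations are invented for illustration.

```python
# Toy critical-path calculation: compute each task's earliest finish from its
# dependencies, then trace the chain that sets the overall project duration.
# Tasks, dependencies, and durations (in days) are invented for illustration.
tasks = {
    "excavation": (3, []),
    "foundation": (5, ["excavation"]),
    "framing": (10, ["foundation"]),
    "rough_in": (6, ["framing"]),
    "roofing": (4, ["framing"]),
    "inspection": (1, ["rough_in", "roofing"]),
}

def earliest_finishes(tasks):
    finish = {}
    def ef(name):
        if name not in finish:
            duration, deps = tasks[name]
            finish[name] = max((ef(d) for d in deps), default=0) + duration
        return finish[name]
    for name in tasks:
        ef(name)
    return finish

def critical_path(tasks, finish):
    # Walk back from the last-finishing task through its latest predecessor.
    name = max(finish, key=finish.get)
    path = [name]
    while tasks[name][1]:
        name = max(tasks[name][1], key=finish.get)
        path.append(name)
    return list(reversed(path))

if __name__ == "__main__":
    finish = earliest_finishes(tasks)
    print("project duration:", max(finish.values()), "days")
    print("path of execution:", " -> ".join(critical_path(tasks, finish)))
```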
Quality Assurance and Permitting
LaCosse also believes generative AI may have a part to play in supporting QA processes within construction work, particularly given the “labor scarcity at the current trajectory.”
“Generative AI has a very near-term role to play in QA for architectural drawings, layout maps, and QA,” he explains, helping “reconcile proposed architectural drawings and project outlines with permitting and code structures, [which] stands to benefit everybody.”
He continues, “The permit office from the municipality […] can get more permits and licensing processed, which is ultimately good for the tax base.”
“It’s good for the construction company because they can expedite, they can have a shorter setup and job initiation runway.” He concludes, “it’s ultimately good for everyone in the field because the permitting, the field permitting inspectors, they’re going to have less ground to cover because they’re not going to have to have eyes on the prize to go through a manual process of a lot of permit approval.”
When the technology is properly deployed, he explains, construction teams will be empowered to work with more agility in the field, “able to essentially be dotting around in the field to jobsites. Just cross checking what the AI has reconciled between the plans and the permitting and simply offering validation that everything is going to plan, which I think is the appropriate use.”
The use case LaCosse outlines runs parallel to Haworth’s.
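For a sense of what the simplest version of such a reconciliation check could look like, here is a toy sketch that flags proposed plan values falling below code minimums, so a human reviewer only has to validate the exceptions. The rules and plan values are invented and do not reflect any real building code; a real system would extract them from drawings and jurisdiction-specific code databases.

```python
# Toy plan-versus-code reconciliation: flag any proposed value that falls below
# the corresponding code minimum for human review. Rules and values are
# invented and do not reflect any real building code.
CODE_MINIMUMS = {
    "ceiling_height_ft": 7.0,
    "egress_window_area_sqft": 5.7,
    "corridor_width_in": 36.0,
}

proposed_plan = {
    "ceiling_height_ft": 8.0,
    "egress_window_area_sqft": 5.0,   # below the minimum: should be flagged
    "corridor_width_in": 40.0,
}

def flag_for_review(plan, minimums):
    return {
        key: (value, minimums[key])
        for key, value in plan.items()
        if key in minimums and value < minimums[key]
    }

if __name__ == "__main__":
    for key, (value, required) in flag_for_review(proposed_plan, CODE_MINIMUMS).items():
        print(f"REVIEW: {key} = {value} (code minimum {required})")
```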
Construction Safety Training
Jason Braun, an Instructional Designer, believes GPT has a place in teaching apprentices and navigating jobsite safety as the industry’s need for skilled trades outpaces the supply of experienced tradespeople.
Braun recently co-edited and contributed to his first full-length book, Designing Context-Rich Learning by Extending Reality, which discusses novel uses of technology like augmented reality and user experience design relative to learning design. In addition to working in higher education, Braun has street cred, having worked in heavy industrial construction at Praxair and ConocoPhillips locations, where he attended safety briefings.
He has a positive outlook on how generative AI may improve education rather than hinder it, pointing to a recent TED Talk by Sal Khan, CEO of Khan Academy, that expresses the same view.
In the hands of a safety trainer, Braun believes the technology can be useful for breaking down concepts and reorganizing them quickly as needs change. “What I really think it does well is if you have a concept that you need to teach right,” he explains. “You can use GPT to chunk down something into any number of sections or containers you need, as well as scheduling,” which he notes may also aid in project management.
“You’ve got a project plan and you plugged in your dates and your deliverables,” he says. If a sudden change occurs, GPT can help dynamically adjust those dates and deliverables based on the new constraints. What would normally require a lot of manual calculation from a project manager, he explains, GPT can automate.
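The recalculation Braun describes is, at bottom, consistent date arithmetic applied across a plan; the appeal of a model like GPT is doing it conversationally, at scale, and across messier constraints. Here is a minimal sketch of the underlying adjustment, with invented dates and deliverables.

```python
# Minimal sketch of the recalculation a model could be asked to automate: when
# one deliverable slips, push it and every later deliverable by the same delay.
# Dates and deliverables are invented for illustration.
from datetime import date, timedelta

plan = {
    "foundation poured": date(2023, 9, 4),
    "framing complete": date(2023, 9, 22),
    "rough-in inspection": date(2023, 10, 2),
    "drywall hung": date(2023, 10, 13),
}

def apply_delay(plan, slipped_deliverable: str, days: int):
    """Shift the slipped deliverable and everything scheduled on or after it."""
    cutoff = plan[slipped_deliverable]
    shift = timedelta(days=days)
    return {name: (d + shift if d >= cutoff else d) for name, d in plan.items()}

if __name__ == "__main__":
    for name, d in apply_delay(plan, "framing complete", 5).items():
        print(f"{name}: {d:%b %d, %Y}")
```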
Braun also believes that GPT could be used as an instructional design tool, similar to how web developers and cybersecurity engineers are able to use “sandboxes,” testing environments used to safely experiment. A cybersecurity engineer, for example, may use a sandbox to safely inspect a piece of malware on a virtual machine, enabling them to understand how it works to offer better defenses against it without risking collateral damage to their own systems and network.
Similarly, Braun believes GPT could help provide immersive training to the next generation of skilled trades. “There could be some thought experiments or ways to scale down key concepts and the kind of learning you don’t want to do on the job and in someone’s house,” he explains, “Safe places to make mistakes.”
He quips, “How can I learn this principle like in a way where I am not ruining this cherry oak or some expensive wood, or where I’m not hooking it up to 180 volts and I’m only learning this principle with a 9 Volt battery, right?”
Other Examples of AI in Construction
Artificial intelligence, beyond the generative AI that’s ruling the headlines, is a large field with applications in construction and many other disciplines.
Machine learning, a subset of artificial intelligence, uses algorithms that learn patterns from large amounts of data and has enabled technological breakthroughs that positively impact humanity. Machine learning in power tools, for example, has been used to develop safety features like anti-kickback, which prevents dangerous situations where a saw or a drill unexpectedly binds up and violently propels back toward the operator, in some cases leading to serious injuries. Similarly, Apple recently announced how it has applied machine learning to accessibility features, including tools for cognitive accessibility and for users who are hearing or visually impaired.
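As a generic illustration only (emphatically not Milwaukee Tool’s actual implementation, which is described as using machine learning rather than a fixed threshold), the sketch below shows the underlying signal such a feature watches for: a sudden collapse in rotation rate that indicates a bind-up, at which point power would be cut.

```python
# Generic bind-up ("kickback") illustration: watch the tool's rotation rate and
# cut power if it collapses faster than a threshold between samples. This is a
# simplified heuristic; thresholds and sensor values are invented.
SAMPLE_RATE_HZ = 1000            # assumed gyro sample rate
MAX_DECEL_RAD_S2 = 800.0         # angular deceleration that signals a bind-up

def detect_kickback(angular_velocity_rad_s):
    """Return the sample index at which power should be cut, or None."""
    dt = 1.0 / SAMPLE_RATE_HZ
    previous = angular_velocity_rad_s[0]
    for i, omega in enumerate(angular_velocity_rad_s[1:], start=1):
        if (previous - omega) / dt > MAX_DECEL_RAD_S2:
            return i
        previous = omega
    return None

if __name__ == "__main__":
    # Steady cutting, then the blade binds and rotation collapses within ~3 ms.
    stream = [150.0] * 50 + [120.0, 60.0, 5.0]
    print("cut power at sample:", detect_kickback(stream))
```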
Another novel use is in the medical field, says Sheldon Gaskell, an instructor of writing at the University of Colorado-Colorado Springs (UCCS). “In the medical field,” he explains, “there’s a lot of uses, like diagnosis technology that uses AI to identify skin cancer.”
Gaskell, who has taught in the English department at UCCS for the past five years, teaches a research-argument emphasis course in which he’s used AI as a theme. “My students are researching a lot of different areas within their careers and how AI is shaping them,” he explains. “We teach stasis theory,” he adds, “which is a concept of inquiry in the research process and going through different levels, categorizing the situation, finding causes and consequences, and then evaluating the circumstances of that issue.”
Gaskell is mostly concerned about the way in which AI is developed. “I think that the way that AI is being developed is really problematic and there’s no way of really stopping that development really,” he explains.
Currently, AI has the potential to be “developed in the military, which would be disastrous, or the corporate world, which is typically in the pursuit of money.” “Developing GPT in the humanities,” he adds, “is better than other alternatives.”
Outside of his concerns about AI development, he thinks “GPT might be beneficial because it’s going to be more understanding of how humans interact and communicate.”
GPT, he explains, may be beneficial for student writers “in the peer review process and brainstorming when writers are having trouble, helping brainstorm new ideas or avenues for their research or their writing as an ‘invention device.’”
Where Experts Agree: AI Should Augment, Not Replace Humans
The common theme most experts agree on is that AI can augment humans’ capacity to get more work done, which is particularly helpful in construction, where labor is hard to come by. Fully replacing humans and using the technology to drive a mass exodus of workers, they agree, is NOT the intended result. At the very least, large tech companies should not use AI as an excuse to perpetuate mass layoffs, but rather as an opportunity to help their workers do their best work and make humanity, on the whole, happier.
“No matter how advanced, construction technology can only take us part of the way,” Haworth explains, “It still takes highly skilled minds to fact-check, analyze and understand the nuances required for each stage of construction.”
“If we’re serious about using generative AI in a commercial sense,” he adds, “then we must ensure that the data in question is up to date and in line with the latest laws and regulations.”
LaCosse agrees. “It’s not so that we can replace people with robots, but so that we can move those highly commodified human laborers with advanced reasoning capabilities into the higher skill and higher pay positions.”
He adds, “You don’t necessarily remove the human from the whole equation. You just find out where a human speaking to another human is best. That’s the secret sauce in this whole thing.”
Bottom Line
Generative AI isn’t going anywhere. As experts agree, finding responsible uses of the technology may offer construction companies a competitive edge. However, knowing the perils of using new technology is of equal importance. As the emerging adage goes: You won’t be replaced by AI; you’ll be replaced by someone using AI. We’ll offer these parting thoughts: Finding responsible uses for AI—including proper vetting—can be powerful. But it’s not an excuse for throwing caution to the wind. Consider what Jerry Levine, general counsel at London-based ContractPodAi, told Construction Dive: “If somebody is solely relying on AI-generated contracts for a construction project, they’re really just asking for trouble.”
About the Authors
Lucas Marshall is a Content Marketing and SEO Manager for Milwaukee Tool, where he’s supported digital products like ONE-KEY™ for the past four years.
Mike Anderson is a Content Marketer and Product Journalist for Milwaukee Tool, where he specializes in digital storytelling. Prior to joining Milwaukee Tool, he was a reporter for many years; his work has appeared in the New York Times and garnered awards from the AP.
Their work can be read in the company’s Connectivity Blog.