Ken Larson

Don’t Forget About ‘Old-Fashioned’ Artificial Intelligence


“DEFENSE ONE” By Ross Wilkers


“For all of the hype about ChatGPT and its generative ilk, older machine-learning tools and techniques are still useful and getting better.”

_______________________________________________________________________________

“For all the talk in 2023 about generative artificial intelligence’s potential promises and perils, what could be described as “old-fashioned AI” can do the job just fine in many instances.


One way to think about old-fashioned AI versus generative AI: in the former, the machines for the most part do not talk back to the operator. Ask a generative AI tool a question, however, and it does talk back to the user.


The 2024 edition of Deloitte’s annual report on government technology trends to watch in the year ahead singles out generative AI as an area of interest for agencies, but the author also reminded me that the other category is advancing as well.


Scott Buchholz, chief technology officer for Deloitte’s government and public services practice, pointed out two categories where old-fashioned AI still comes into play: instances that are approved automatically because they are routine, and others that are flagged for further examination so humans can look at them more closely.
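As a rough sketch of how that first pattern often works in practice (an illustrative assumption, not Deloitte's implementation), a conventional classifier can score each case and auto-approve only the ones it is highly confident are routine, routing everything else to a person:

from sklearn.linear_model import LogisticRegression
import numpy as np

# Toy training data: two numeric features per case, label 1 = routine.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([1, 1, 0, 0])
model = LogisticRegression().fit(X, y)

def route_case(features, threshold=0.95):
    # Auto-approve only when the model is highly confident the case is routine;
    # anything below the threshold goes to a human reviewer.
    p_routine = model.predict_proba([features])[0][1]
    return "auto_approve" if p_routine >= threshold else "human_review"

print(route_case([0.15, 0.15]))  # looks routine, may clear the threshold
print(route_case([0.5, 0.5]))    # ambiguous, goes to human_review

The threshold is the tuning knob: raising it sends more cases to people, lowering it automates more.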


Predictive modeling in use cases such as forecasting foot traffic through a security checkpoint also remains suitable for AI as the world has known it, according to Buchholz.
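A minimal sketch of that kind of predictive model, assuming hourly historical counts are available (the synthetic data and model choice here are illustrative, not Buchholz's example in detail):

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic history: 30 days of hourly checkpoint counts with morning and evening peaks.
rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 30)
weekday = np.repeat(np.arange(30) % 7, 24)
traffic = (200 * np.exp(-((hours - 8) ** 2) / 8)
           + 150 * np.exp(-((hours - 17) ** 2) / 8)
           + rng.normal(0, 10, hours.size))

X = np.column_stack([hours, weekday])
model = GradientBoostingRegressor().fit(X, traffic)

# Forecast the 8 a.m. load on day-of-week 2 to plan checkpoint staffing.
print(model.predict([[8, 2]]))

Nothing generative is involved; it is the same kind of supervised regression agencies have been running for years.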


“All of those other sorts of things that we’ve spent at least 10 or 20 years now trying to figure out and work through somehow, are incredibly important, and arguably in some cases more valuable than what can currently be done with generative AI,” Buchholz told me.


Deloitte’s report points out that generative AI does complicate the never-ending discussion over whether machines are capable of thought. ChatGPT and the others like it can at least imitate human cognition as they show promise for trawling through large volumes of content.


“There’s more experimentation going on than most people recognize; the government actually does have implementations of generative AI technologies in production,” Buchholz said. “They tend to do things like searching large bodies of policies and answering questions in English, which sounds a little mundane until you try reading government policies, and then you might have some empathy for the reason why people would find that really exciting.”
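As a rough illustration of the pattern Buchholz describes (my assumption of how such a system might be wired, not a description of any agency's implementation), a retrieval step can pull the policy passages most relevant to a plain-English question before a generative model composes the answer; the generate_answer call at the end is hypothetical:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny stand-in corpus of policy text; a real system would index thousands of documents.
policies = [
    "Employees must complete security awareness training annually.",
    "Travel reimbursements require receipts for expenses over 75 dollars.",
    "Contractors need agency approval before accessing production systems.",
]

vectorizer = TfidfVectorizer()
policy_vectors = vectorizer.fit_transform(policies)

def top_passages(question, k=2):
    # Rank passages by cosine similarity to the question and keep the best k.
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, policy_vectors)[0]
    return [policies[i] for i in scores.argsort()[::-1][:k]]

question = "Do I need a receipt to get reimbursed for travel?"
context = top_passages(question)
print(context)
# A production system would now pass question and context to a generative model,
# e.g. answer = generate_answer(question, context)  # hypothetical helper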


Indeed, several pilots are already live across the public sector ecosystem in a push to better understand the full potential of generative AI tools, both positive and negative.


Buchholz also pointed out one aspect of that technology that will surprise some but not others: the rate of adoption across commercial industries is not dramatically different from that in government.


One aspect unique to government is the volume of legacy systems that accumulate substantial technical debt over time, which Buchholz said naturally means more work is needed in that context than in a comparable commercial setting.


Computing power also continues to be a high-priority discussion item across the entire technology ecosystem, given both a strong desire by many enterprises to move into cloud environments and a universal realization that everything of importance in the world runs on chips.


Deloitte’s report says that standard cloud services still provide “more than enough functionality for most business-as-usual operations,” but more specialized hardware, including custom chips, is required for more cutting-edge use cases.


Buchholz said that, for the most part, microprocessor speeds have not dramatically improved over the past couple of years, and the industry is nearing the point where chips can no longer simply be made smaller to make systems go faster.


The global hyperscale cloud providers are realizing that and coming up with special-purpose chips, which could be used for singular, higher-end use cases like machine learning.


Buchholz said that means options are increasing for which workloads can be run, albeit more slowly if the chips have greater specialization.


“What’s happening is there’s this increasing diversity of ways you can do things and different cost performance tradeoffs,” Buchholz said.


That brings the discussion to quantum computing, which looks to be becoming more real as time goes on and is certainly taking up more of the spotlight. The Dec. 3 episode of CBS’ 60 Minutes, for instance, had a segment on the race to lead the world in quantum.


“Those may be starting to come online in the next two or three years, if you believe vendors’ road maps and they can continue to meet them,” said Buchholz, who wears the dual hat of leading Deloitte’s global quantum efforts. “That becomes really exciting because at that point in time, we have new ways of solving problems that have new and interesting characteristics.”


ABOUT THE AUTHOR:

Ross Wilkers covers the business of government contracting, companies and trends that shape the market. He joined WT in 2017 and works with Editor-in-Chief Nick Wakeman to host and produce our WT 360 podcast that features discussions with the market’s leading executives and voices. Ross is a native of Northern Virginia and is an alumnus of George Mason University.
