
Copilots and low-code apps are creating a new ‘vast attack surface’ – 4 ways to fix that


Today’s average large enterprise is likely to have nearly 80,000 apps built with copilots and low-code platforms. That is a potential security nightmare: a recent study finds that more than six in ten of these apps, 62%, contain security vulnerabilities.

The study, released by Zenity, finds that the use of enterprise copilots and low-code development tools is growing 40% year over year. It is based on data gathered from large organizations, but the implications are just as applicable to small and medium-sized businesses.

Also: The line between citizen developers and IT pros gets fuzzier

Currently, the typical enterprise customer in the study has an average of 79,602 apps built across various copilots and low-code platforms. By comparison, the study’s authors estimate that the average large enterprise has at least 473 SaaS-based apps.

The study’s authors define “copilots” as the range of no-code and low-code tools and platforms including Microsoft Copilot, Power Platform, Salesforce, ServiceNow, Zapier, OpenAI, and more. The average large organization has about seven copilot and low-code platforms in use, they estimate. 

Among the 80,000 apps and copilots developed outside of the traditional software development lifecycle are roughly 50,000 vulnerabilities, the study concludes. The main risk cited is “business users having the ability to build apps and copilots without needing a coding background and without proper security guardrails in place,” the study’s authors note. The top technical risks seen with copilot and low-code platforms include authorization misuse, authentication failures, and data and secrets handling, the study finds. 


“In traditional application development, apps are carefully built throughout the software development lifecycle, where each app is continuously planned, designed, implemented, measured, and analyzed,” they explain. “In modern business application development, however, no such checks and balances exists and a new form of shadow IT emerges.”

Within the range of copilot solutions, “anyone can build and access powerful business apps and copilots that access, transfer, and store sensitive data and contribute to critical business operations with just a couple clicks of the mouse or use of natural language text prompts,” the study cautions. “The velocity and magnitude of this new wave of application development creates a new and vast attack surface.” 

Also: The data suggests gen AI boosts software productivity – for these developers

Many enterprises encouraging copilot and low-code development are “not fully embracing that they need to contextualize and understand not only how many apps and copilots are being built, but also the business context such as what data the app interacts with, who it is intended for, and what business function it is meant to accomplish.”

As a result, “there are a lot of vulnerabilities and misconfigurations that are hard to contextualize and sort out who needs to do what to mitigate risk.”

Untrusted guest access via copilot and low-code apps is another issue. “The average enterprise in the study has over 8,641 instances of untrusted guest users having access to apps that are developed via copilots and low-code,” the study shows. More than 72% of those cases “provide privileged access to untrusted guests; meaning unmonitored and unmanaged guests can create, modify, or delete these apps.” 

Also: Code faster with generative AI, but beware the risks when you do

Here are some of the steps the study’s authors recommend to address these vulnerabilities:

  • Configure for security up front: Ensure that controls are in place “to flag any app that contains a hard-coded secret or insecure step in how it retrieves credentials,” they urge. “Contextualize apps that are being built to ensure that critical business apps that also come into contact with sensitive internal data have proper authentication controls. Once this is done, ensuring that proper authentication is in place for apps that require access to sensitive data is a top priority.” (A secret-scanning sketch follows this list.)
  • Establish guardrails: “Due to the nature of copilots and AI in general, strict guardrails need to be in place in order to prevent oversharing apps, unnecessarily bridging access to sensitive data via AI, sharing end user interactions with copilots, and more. Without them, enterprises are staring down increased risks for malicious prompt injection and data leakage.” (A minimal guardrail sketch follows this list.)
  • Regulate guest access: Guest users are held to different security standards than full-time employees “yet still possess privileged access to apps and copilots built across low-code platforms,” the study’s authors point out. It’s critical to “limit application and copilot access to only who needs them in order to perform their respective duties.” (A guest-access audit sketch follows this list.)
  • Rethink connectors to sensitive data: Understand which apps are connected to sensitive data, the authors recommend. “Then establish how data is sent and received to those applications, ensuring that any connectors, particularly those that access sensitive data, are using HTTPS calls.” (A connector-checking sketch follows this list.)
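
To make the first recommendation concrete, here is a minimal Python sketch of the kind of check that could flag hard-coded secrets in exported app definitions. The JSON export layout, the directory path, and the secret patterns are all assumptions made for illustration; they are not tied to any specific copilot or low-code platform.

```python
import re
from pathlib import Path

# Illustrative sketch only: scan exported low-code app definitions (assumed to
# be JSON files under ./app-exports) for values that look like hard-coded
# secrets. The patterns are examples, not a complete secret taxonomy.
SECRET_PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|client[_-]?secret|password)\s*[:=]\s*\S+"),
    re.compile(r"AKIA[0-9A-Z]{16}"),         # AWS access key ID format
    re.compile(r"eyJ[A-Za-z0-9_-]{20,}\."),  # JWT-like token prefix
]

def flag_hardcoded_secrets(export_dir: str) -> list[dict]:
    """Return findings pairing each app definition with the matching snippet."""
    findings = []
    for path in Path(export_dir).rglob("*.json"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for pattern in SECRET_PATTERNS:
            for match in pattern.finditer(text):
                findings.append({
                    "app_definition": str(path),
                    "match": match.group(0)[:60],  # truncate for the report
                })
    return findings

if __name__ == "__main__":
    for finding in flag_hardcoded_secrets("./app-exports"):
        print(f"[FLAG] {finding['app_definition']}: {finding['match']}")
```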
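For the guardrails recommendation, the following sketch shows one way a thin filtering layer might sit between end users and a copilot, blocking obvious prompt-injection attempts and redacting sensitive-looking data in responses. The pattern lists and the send_to_copilot callable are hypothetical; real deployments would rely on platform-level DLP and AI safety controls rather than ad hoc regexes.

```python
import re

# Illustrative guardrail layer between users and a copilot. Both pattern lists
# and the send_to_copilot callable are assumptions for this sketch.
INJECTION_PATTERNS = [
    re.compile(r"(?i)ignore (all )?previous instructions"),
    re.compile(r"(?i)reveal (your )?(system prompt|credentials)"),
]
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                           # SSN-like
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),                          # card-like number
    re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),  # email address
]

def guarded_copilot_call(prompt: str, send_to_copilot) -> str:
    """Block suspicious prompts and redact sensitive data in responses."""
    if any(p.search(prompt) for p in INJECTION_PATTERNS):
        return "Request blocked: possible prompt injection."
    response = send_to_copilot(prompt)  # assumed callable into the copilot
    for pattern in SENSITIVE_PATTERNS:
        response = pattern.sub("[REDACTED]", response)
    return response
```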
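For guest access, a simple audit could start from a sharing report exported from the platform. The CSV column names and role labels below are assumptions used only to illustrate the idea of flagging guests who hold privileged roles.

```python
import csv
from collections import defaultdict

# Illustrative audit of an exported sharing report (assumed CSV columns:
# app_name, principal, user_type, permission). Flags guest users who hold
# privileged roles on copilot/low-code apps.
PRIVILEGED_ROLES = {"Owner", "CanEdit", "Admin"}

def audit_guest_access(report_path: str) -> dict[str, list[str]]:
    """Map each app to the guest principals that hold privileged access."""
    risky = defaultdict(list)
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["user_type"] == "Guest" and row["permission"] in PRIVILEGED_ROLES:
                risky[row["app_name"]].append(row["principal"])
    return dict(risky)

if __name__ == "__main__":
    for app, guests in audit_guest_access("sharing_report.csv").items():
        print(f"{app}: review or downgrade {', '.join(guests)}")
```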
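For connectors, the sketch below walks exported app definitions and flags any endpoint that is not reached over HTTPS. Again, the JSON structure, field layout, and directory path are assumed for illustration rather than taken from a real platform schema.

```python
import json
from pathlib import Path
from urllib.parse import urlparse

# Illustrative connector check: walk exported app definitions (assumed JSON)
# and flag any URL-looking value whose scheme is not HTTPS.
def iter_endpoints(obj):
    """Recursively yield anything that looks like a URL string."""
    if isinstance(obj, dict):
        for value in obj.values():
            yield from iter_endpoints(value)
    elif isinstance(obj, list):
        for item in obj:
            yield from iter_endpoints(item)
    elif isinstance(obj, str) and obj.startswith(("http://", "https://")):
        yield obj

def flag_insecure_connectors(export_dir: str) -> list[tuple[str, str]]:
    """Return (app file, endpoint) pairs whose scheme is not HTTPS."""
    insecure = []
    for path in Path(export_dir).rglob("*.json"):
        try:
            definition = json.loads(path.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue  # skip files that are not valid JSON exports
        for url in iter_endpoints(definition):
            if urlparse(url).scheme != "https":
                insecure.append((str(path), url))
    return insecure

if __name__ == "__main__":
    for app_file, url in flag_insecure_connectors("./app-exports"):
        print(f"[INSECURE] {app_file} calls {url}")
```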



