Organizations are already adopting GenAI tools, evaluating how to integrate these tools into their business plans, or both. Informed decision-making and effective planning depend on hard data, yet such data remains surprisingly scarce.
The “Enterprise GenAI Data Security Report 2025” by LayerX delivers unprecedented insights into the practical use of AI tools in the workplace, while highlighting critical vulnerabilities. Drawing on real-world telemetry from LayerX’s enterprise clients, this report is one of the few reliable sources that details actual employee use of GenAI.
For example, it reveals that nearly 90% of enterprise AI usage happens outside the visibility of IT, exposing organizations to significant risks such as data leakage and unauthorized access.
Below, we share some of the report’s key findings. Read the full report to refine and enhance your security strategies, leverage data-driven decision-making for risk management, and make the case for resources to strengthen GenAI data protection measures.
To register for a webinar covering the key findings of this report, click here.
Use of GenAI in the Enterprise Is Casual at Most (for Now)
While the GenAI hype may make it seem like the entire workforce has moved its office operations to GenAI, LayerX finds actual use somewhat more lukewarm. Roughly 15% of users access GenAI tools daily. That is not a share to be ignored, but it is not the majority.
Yet. Here at The New Stack we concur with LayerX’s assessment, predicting this trend will accelerate quickly, especially since 50% of users currently use GenAI every other week.
In addition, they find that 39% of regular GenAI tool users are software developers, meaning the greatest potential for data leakage through GenAI involves source and proprietary code, along with the risk of pulling risky code into your codebase.
How Is GenAI Being Used? Who Knows?
Since LayerX sits in the browser, the tool has visibility into the use of shadow SaaS. This means it can see employees using tools that were not approved by the organization’s IT, or that are accessed through non-corporate accounts.
And while GenAI tools like ChatGPT are used for work purposes, nearly 72% of employees access them through their personal accounts. Even when employees do access them through corporate accounts, only about 12% of that usage is done with SSO. As a result, nearly 90% of GenAI usage is invisible to the organization. This leaves organizations blind to ‘shadow AI’ applications and the unsanctioned sharing of corporate information with AI tools.
50% of Pasting Activity into GenAI Includes Corporate Data
Remember the Pareto principle? In this case, while not all users use GenAI daily, the users who do paste into GenAI applications do so frequently, and with potentially confidential information.
LayerX found that among users who submit data to GenAI tools, pasting of corporate data occurs almost four times a day on average. This could include business information, customer data, financial plans, source code, and more.
How to Plan for GenAI Usage: What Enterprises Must Do Now
The findings in the report signal an urgent need for new security strategies to manage GenAI risk. Traditional security tools fail to address the modern AI-driven workplace, where applications are browser-based. They lack the ability to detect, control, and secure AI interactions at the source: the browser.
Browser-based security provides visibility into access to AI SaaS applications, unknown AI applications beyond ChatGPT, AI-enabled browser extensions, and more. This visibility can be used to apply DLP solutions for GenAI, allowing enterprises to safely include GenAI in their plans and future-proof their business.
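To make the browser-based DLP idea concrete, here is a minimal sketch of how a paste-time check might look. The function name, pattern list, and thresholds are illustrative assumptions, not details from the LayerX report or product; a real DLP engine would use far richer classifiers and policy rules.

```typescript
// Illustrative sketch only: the patterns below are simplistic examples of
// "corporate data" signals, not a production DLP ruleset.
const SENSITIVE_PATTERNS: { label: string; re: RegExp }[] = [
  { label: "credit card number", re: /\b(?:\d[ -]?){13,16}\b/ },
  { label: "AWS access key", re: /\bAKIA[0-9A-Z]{16}\b/ },
  { label: "private key header", re: /-----BEGIN [A-Z ]*PRIVATE KEY-----/ },
  { label: "email address", re: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/ },
];

// Returns the labels of any sensitive patterns found in pasted text.
function scanPaste(text: string): string[] {
  return SENSITIVE_PATTERNS.filter((p) => p.re.test(text)).map((p) => p.label);
}

// In a browser extension, this could be wired to the page's paste events:
// document.addEventListener("paste", (e) => {
//   const hits = scanPaste(e.clipboardData?.getData("text") ?? "");
//   if (hits.length > 0) {
//     // block the paste, warn the user, or report to a DLP backend
//   }
// });
```

Because the scan runs in the browser, it can cover personal accounts and unsanctioned AI tools alike, which is exactly the visibility gap the report highlights.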
To access more data on how GenAI is being used, read the full report.