
Data Security and the GPT Store: What You Need to Know!


The GPT Store is a brilliant idea. If you haven’t explored it yet, log into ChatGPT and take a look. It’s a marketplace filled with millions of custom versions of ChatGPT, each built and configured for a specific task. I use several regularly, especially the ones from OpenAI itself, such as the Data Analyst GPT. You can read more about custom GPTs here.


However, while the GPT Store is a brilliant concept, it’s still in its infancy. That means it has some growing pains, and right now, it’s a bit of a Wild West out there. The biggest strength of the store (allowing anyone to create and publish a custom GPT) is also its greatest data security risk.


The Security Risks of Custom GPTs


For everyday users, the problem is simple: when trying out a new custom GPT, there’s no immediate way to tell how good it is or who built it. The bigger risk, though, is that many custom GPTs communicate with external domains, which could expose your data. Whilst there is a warning before a custom GPT connects to an external domain, this still poses a major security risk for individuals and businesses.
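To see why external connections matter, it helps to know that a custom GPT reaches other services through “actions”, each declared in an OpenAPI schema that names the server your data will be sent to. Below is a minimal Python sketch of how you might review those declared domains against an approved list. The schema, the domain list, and the helper function are illustrative assumptions, not a real OpenAI tool or API.

```python
from urllib.parse import urlparse

# Hypothetical example of the kind of OpenAPI schema a custom GPT
# "action" declares; the server URL is where your prompt data is sent.
action_schema = {
    "servers": [{"url": "https://api.example-vendor.com"}],
    "paths": {"/lookup": {"get": {"summary": "Look up a record"}}},
}

# Domains your security team has explicitly approved (illustrative only).
ALLOWED_DOMAINS = {"api.trusted-partner.com"}

def external_domains(schema: dict) -> set[str]:
    """List every hostname the action schema says it will contact."""
    return {urlparse(s["url"]).hostname for s in schema.get("servers", [])}

for domain in external_domains(action_schema):
    verdict = "approved" if domain in ALLOWED_DOMAINS else "NOT approved"
    print(f"{domain}: {verdict}")
```

In practice this kind of review is manual today: before approving a GPT for your team, check which domains its actions declare and whether you trust them with your data.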


For businesses on Team or Plus accounts, the situation is even more concerning. By default, every user has the ability to create and publish a custom GPT to the store, and this setting cannot be turned off unless you’re on an Enterprise plan. That means an employee could accidentally share a custom GPT on the store, exposing sensitive instructions or internal knowledge to the public.


How to Protect Your Data


So, what can you do to minimise the risks? Here are some simple yet effective steps:


1. Disable Access to Third-Party GPTs (if you have a Team account)


If you’re using ChatGPT in a business environment, go into settings and restrict access to third-party GPTs. This allows your team to continue using OpenAI’s official GPTs (such as Data Analyst) while blocking external ones that could pose a risk.


Disable third-party GPTs

2. Train Your Users on Sharing Settings


Educate employees on the sharing options when creating custom GPTs. Reinforce the importance of not sharing publicly unless explicitly approved by your IT or security team.


The GPT sharing options your team needs to understand

3. Educate Users on External Connections


When a GPT is connected to an external data source, OpenAI does display a warning message before use. Show your users a screenshot of this warning and help them understand what it means. Many people simply click past such notifications without reading them, and that habit is itself a security risk.

GPT warning when connecting to an external domain

4. Encourage a "No Sensitive Data" Policy


Ensure employees avoid inputting personal or sensitive company data into any GPT. This simple practice can prevent accidental leaks.
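A written policy works best with a safety net behind it. The sketch below shows one lightweight way to screen a prompt for obviously sensitive content before it is sent. The patterns and the `check_prompt` helper are illustrative assumptions, not a production data-loss-prevention tool; a real deployment would need far broader coverage and proper tooling.

```python
import re

# Illustrative patterns for data that should never go into a GPT prompt.
SENSITIVE_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "UK phone number": re.compile(r"\b0\d{4}\s?\d{6}\b"),
    "card-like number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the names of any sensitive patterns found in the prompt."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(prompt)]

# Example run with a made-up contact detail.
prompt = "Summarise this: contact jane.doe@acme.co.uk on 01632 960000."
findings = check_prompt(prompt)
if findings:
    print("Blocked - prompt appears to contain:", ", ".join(findings))
else:
    print("Prompt looks clean; safe to send.")
```

Even a crude check like this catches the most common slips, and it reinforces the policy every time it fires.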


Will OpenAI Address This Issue?


For businesses, this single flaw is a major issue. Right now, there’s no way for Team accounts (without upgrading to Enterprise) to prevent employees from publishing custom GPTs. This seems like a simple fix OpenAI could implement, at least for Team accounts, but until they do, businesses need to remain vigilant.


As it stands, this lack of control is one reason many companies are opting for Microsoft Copilot instead. User training can help, but at the end of the day, it’s much better to have strong security settings by default rather than relying on employees to follow best practices.


Final Thoughts


The GPT Store is a fantastic tool with huge potential, but its open nature means businesses must take extra care when using it. By limiting access to third-party GPTs, educating your team, and implementing strong data security practices, you can continue to enjoy the benefits of AI while keeping your company’s information safe.


Stay informed, stay secure, and let’s hope OpenAI rolls out better security controls in the near future!
