Be Careful With the Data You Give DeepSeek… and Every Other AI

DeepSeek rocked the tech world and the financial markets when it hit the app stores a couple of weeks ago, promising to provide the same kinds of high-performing artificial intelligence models as the established players like OpenAI and Google at a fraction of the cost.

Keep your private information private. (Getty Images)

How to stay safe while using DeepSeek or other AI models

Given that it can often be tough to know which AI model you're actually using, experts say it's best to take care when using any of them.

Here are some tips for doing that.

Be smart with AI just like with everything else. The usual best practices for tech apply here, too. Set long, complicated and unique passwords, always enable two-factor authentication when you can, and keep all your devices and software updated. 
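If you're comfortable with a little scripting, here's what "long, complicated and unique" looks like in practice. The short Python sketch below is illustrative only (it isn't from the article, and the generate_password name and 20-character default are just example choices); it uses the standard library's secrets module to produce a random password, the same kind of output a good password manager would give you:

    # Illustrative sketch: generate a long, random password with Python's
    # standard-library "secrets" module (cryptographically secure randomness).
    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Return a random password of the given length."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())  # output varies on every run

In everyday use, a reputable password manager does this for you and remembers the result, which is the point of the advice above.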

Keep personal info personal. Think before entering personal details about yourself into an AI chatbot. Yes, that covers obvious no-nos like Social Security numbers and banking information, but also the kinds of details that might not automatically set off alarm bells, like your address, place of employment, and friends' or coworkers' names.

Be skeptical. Just as you'd be wary of requests for information that arrive by email, text or social media post, you should be wary of questions an AI asks you, too. Think of it like a first date, Sirota said. If a model asks weirdly personal questions the first time you use it, walk away.

Don’t rush to be an early adopter. Just because an AI or app is trending doesn’t mean you have to have it right away, Morgan said. Decide for yourself how much risk you want to take when it comes to software that’s new to the market.

Read the terms and conditions. Yes, this is a lot to ask, but with any app or software, you should really read these statements before you start handing over data, to get an idea of where your data is going, what it's being used for and who it could be shared with. Those statements could also provide insights into whether an AI or app is collecting and sharing data from other parts of your device, Borene said. If that's the case, turn those permissions off.

Be aware of America’s adversaries. Any app based in China should be treated with suspicion, but so should those from other adversarial or ungoverned states like Russia, Iran or North Korea, Borene said. Privacy rights you might enjoy in places like the US or European Union don’t apply on those apps, regardless of what the terms and conditions say.
