There are three ways that you can do that.
Locking a website down with a password is often the best approach if you want to keep it private. Neither search engines nor random web users can see your content, so it’s quite secure.
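For example, on an Apache server you can require a password with HTTP Basic Authentication using an .htaccess file like the one below. This is just a sketch: the realm name and file paths are placeholders, and other servers (such as nginx) have their own equivalents.

```
# .htaccess: require a username and password for the whole site (Apache example)
AuthType Basic
AuthName "Private site"
# Placeholder path; keep the credentials file outside the web root.
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The credentials file itself can be created with Apache’s htpasswd tool, for example `htpasswd -c /home/example/.htpasswd yourusername`.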
Blocking crawling is another approach. This is done with the robots.txt file. Regular web users can still use your website, while well-behaved search engines know not to access it. This isn’t an ideal approach for private content, though: search engines may still index the URL of a blocked page without accessing its content, for example if other pages link to it. It’s rare, but it can happen.
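As an illustration, a robots.txt file that blocks all crawling of a site looks like this:

```
# https://example.com/robots.txt
# Asks all well-behaved crawlers not to crawl any URL on this site.
User-agent: *
Disallow: /
```

The file must be served at the top level of the host (for example https://example.com/robots.txt); crawlers won’t look for it anywhere else.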
Blocking indexing is the third option. For this, you add a “noindex” robots meta tag to your pages. This tells search engines not to index the page after they crawl it. Users don’t see the meta tag and can still access the page normally.
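The tag itself is a single line in the head of each page you want excluded, for example:

```html
<!-- In the <head> of every page that should stay out of search results -->
<meta name="robots" content="noindex">
```

For non-HTML files such as PDFs, the same directive can be sent as an `X-Robots-Tag: noindex` HTTP response header instead.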
Overall, for private content, our recommendation is to use password protection. It’s easy to check that it’s working, and it prevents anyone from accessing your content. Blocking crawling or indexing is a good option when the content isn’t private, or when you simply want to keep certain parts of a website from appearing in search.