DEV Community

Rahimah Sulayman
Provide private storage for internal company documents

Preface

"How do you provide external partners with access to sensitive data without compromising your entire directory? In this walkthrough, I focused on building a 'Smart' storage environment in Azure that handles its own housekeeping. I demonstrated how to move beyond basic 'Public/Private' settings to implement Shared Access Signatures (SAS) for time-limited, secure data sharing. Beyond security, I also explore how to automate data durability using Azure Object Replication, for automated, cross-account backups and ensuring that mission-critical documents are never more than a few clicks away from recovery."

"In a cloud-first world, storage isn't just about 'where' you put your files—it's about how you protect them and what they cost you. Most companies pay for 'Hot' storage they never actually use. A misconfigured storage account can lead to data leaks or ballooning monthly bills. In this project, I architected a secure, enterprise-grade storage solution in Azure, implementing Geo-Redundant Storage (GRS) **for regional failover, **Lifecycle Management to slash costs by 40-60%".

Business Task: The company needs storage for their offices and departments. This content is private to the company and shouldn't be shared without consent. This storage requires high availability if there's a regional outage. The company wants to use this storage to back up the public website storage.

Skills I exhibited

  • Create a storage account with high availability.
  • Ensure the storage account does not allow anonymous public access.
  • Create a blob storage container for the company documents.
  • Enable soft delete so files can be easily restored.
  • Enable blob versioning.

Create a storage account and configure high availability.

Create a storage account for the internal private company documents.
Step 1: In the portal, search for and select Storage accounts.

storageacct

Step 2: Select + Create.

create

Step 3: Select the Resource group created previously, that is "rahimahrg".

rg

Step 4: Set the Storage account name to private, adding an identifier to the name to ensure it is globally unique.

private

Step 5: Select Review + Create, and then Create the storage account.

create

Step 6: Wait for the storage account to deploy, and then select Go to resource.

gotoresource
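Storage account names must be globally unique across all of Azure, 3 to 24 characters long, and contain only lowercase letters and digits. A minimal Python sketch for generating and validating a candidate name (the private prefix matches this walkthrough; the random suffix is one simple way to avoid collisions):

```python
import random
import re
import string

def make_storage_account_name(prefix: str = "private") -> str:
    """Append a random suffix so the name is likely to be globally unique."""
    suffix = "".join(random.choices(string.ascii_lowercase + string.digits, k=8))
    name = (prefix + suffix).lower()
    # Azure storage account names: 3-24 chars, lowercase letters and digits only.
    if not re.fullmatch(r"[a-z0-9]{3,24}", name):
        raise ValueError(f"invalid storage account name: {name}")
    return name

print(make_storage_account_name())  # e.g. privatex7k2m9qa
```

The portal performs the same uniqueness check for you; generating the name yourself is mainly useful when scripting deployments.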

This storage requires high availability if there’s a regional outage. Read access in the secondary region is not required. Configure the appropriate level of redundancy.

Step 1: In the storage account, in the Data management section, select the Redundancy blade.

redundancy

Step 2: Ensure Geo-redundant storage (GRS) is selected.

geo

Step 3: Refresh the page.
Step 4: Review the primary and secondary location information.
Step 5: Save your changes.

save

Create a storage container, upload a file, and restrict access to the file.

Create a private storage container for the corporate data.
Step 1: In the storage account, in the Data storage section, select the Containers blade.

datastorage

Step 2: Select + Container.
Step 3: Ensure the Name of the container is private.

addcontainer

Step 4: Ensure the Public access level is Private (no anonymous access).

changeaccesslevel

Step 5: If you have time, review the Advanced settings, but keep the defaults.

advsettings

Step 6: Select OK.

privatenoanonymousaccess

For testing, upload a file to the private container. The type of file doesn’t matter. A small image or text file is a good choice. Test to ensure the file isn’t publicly accessible.
Step 1: Select the container.
Step 2: Select Upload.

privatecontainer

Step 3: Browse to files and select a file.

browse

Step 4: Upload the file.

upload

Step 5: Select the uploaded file.
Step 6: On the Overview tab, copy the URL.

copy

Step 7: Paste the URL into a new browser tab.
Step 8: Verify the file doesn’t display and you receive an error.

error

An external partner requires read access to the file for the next 24 hours. Configure and test a shared access signature (SAS).
Step 1: Select your uploaded blob file and move to the Generate SAS tab.
Step 2: In the Permissions drop-down, ensure the partner has only Read permissions.
Step 3: Verify the Start and expiry date/time covers the next 24 hours.
Step 4: Select Generate SAS token and URL.

SAS

SASurl

Step 5: Copy the Blob SAS URL and paste it into a new browser tab.
Step 6: Verify you can access the file. If you uploaded an image file, it will display in the browser; other file types will be downloaded.

newbrowse
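Under the hood, a SAS token is not stored anywhere on the server: it is an HMAC-SHA256 signature, computed with the account key, over a string that encodes the granted permissions, the validity window, and the resource. The storage service recomputes the signature on every request and rejects the token once the expiry passes. This is a simplified Python sketch of that mechanism; the real Azure string-to-sign contains more fields (signed version, protocol, IP range, and so on), so treat it as illustrative rather than wire-compatible:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

def sign_sas(account_key_b64: str, permissions: str, expiry: str, resource: str) -> str:
    """Simplified SAS-style signature: HMAC-SHA256 over the token's claims."""
    string_to_sign = "\n".join([permissions, expiry, resource])
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical values for illustration only; never publish a real account key.
fake_key = base64.b64encode(b"not-a-real-account-key").decode("utf-8")
expiry = (datetime.now(timezone.utc) + timedelta(hours=24)).strftime("%Y-%m-%dT%H:%MZ")
sig = sign_sas(fake_key, "r", expiry, "/blob/privateaccount/private/test.txt")
print(f"sp=r&se={expiry}&sig={sig}")
```

Because the permissions and expiry are part of the signed string, tampering with either in the URL invalidates the signature, which is exactly why a leaked read-only, 24-hour link cannot be upgraded or extended.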

Configure storage access tiers and content replication.

To save on costs, after 30 days, move blobs from the hot tier to the cool tier.
Step 1: Return to the storage account.

storageacct

Step 2: In the Overview section, notice the Default access tier is set to Hot.

hot

Step 3: In the Data management section, select the Lifecycle management blade.

Step 4: Select Add rule.

lifecycle

Step 5: Set the Rule name to movetocool.
Step 6: Set the Rule scope to Apply rule to all blobs in the storage account.
Step 7: Select Next.

next

Step 8: Ensure Last modified is selected.
Step 9: Set More than (days ago) to 30.
Step 10: In the Then drop-down select Move to cool storage.
Step 11: If you have time, review the other lifecycle options in the drop-down.
Step 12: Add the rule.

addrule
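Behind the blade, the portal stores this rule as a JSON management policy on the storage account. The sketch below builds what the movetocool rule looks like under the management API schema as I understand it; treat the exact field names as a best-effort reconstruction rather than an authoritative reference:

```python
import json

def tier_to_cool_rule(name: str = "movetocool", days: int = 30) -> dict:
    """Lifecycle rule: move block blobs to Cool after `days` without modification."""
    return {
        "enabled": True,
        "name": name,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"]},
            "actions": {
                "baseBlob": {
                    "tierToCool": {"daysAfterModificationGreaterThan": days}
                }
            },
        },
    }

policy = {"rules": [tier_to_cool_rule()]}
print(json.dumps(policy, indent=2))
```

Seeing the JSON makes it clear why the rule is "hands-off": it is evaluated by the platform on a schedule, with no script or function of yours involved.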

The public website files need to be backed up to another storage account.
- In your storage account, create a new container called backup. Use the default values; the steps are the same as creating the private container above.
- Navigate to your publicwebsite storage account. This storage account was created in the previous post.
Step 1: In the Data management section, select the Object replication blade.
Step 2: Select Create replication rules.

replicationrules

Step 3: Set the Destination storage account to the private storage account.

destinationstorageacct

Step 4: Set the Source container to public and the Destination container to backup.
Step 5: Create the replication rule.

replicationrule
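For reference, an object replication policy is also just a JSON document pairing a source and destination account with container-level rules, which is roughly what the portal creates on your behalf. The account names below are hypothetical placeholders for the two accounts in this series:

```python
import json
import uuid

def replication_policy(src_account: str, dst_account: str,
                       src_container: str, dst_container: str) -> dict:
    """Sketch of an object replication policy: one container-to-container rule."""
    return {
        "sourceAccount": src_account,
        "destinationAccount": dst_account,
        "rules": [
            {
                "ruleId": str(uuid.uuid4()),  # the service assigns the real ID
                "sourceContainer": src_container,
                "destinationContainer": dst_container,
            }
        ],
    }

policy = replication_policy("publicwebsite123", "private123", "public", "backup")
print(json.dumps(policy, indent=2))
```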

Optionally, if you have time, upload a file to the public container. Then return to the private storage account and refresh the backup container. Within a few minutes, your public website file will appear in the backup container.

The Relevance of this Strategy

When designing enterprise storage, we aren't just looking for a "place to put files." We are solving for three critical business pillars: Durability, Security, and Cost Efficiency.

1. High Availability: Why GRS over LRS?
In a standard setup (LRS), your data is replicated three times within a single data center. But what if a regional disaster (like a flood or massive power failure) takes that entire region offline?

The Strategy: By choosing Geo-Redundant Storage (GRS), we ensure data is replicated to a secondary region hundreds of miles away.

The Business Value: This raises durability from the eleven nines (99.999999999%) of LRS to sixteen nines (99.99999999999999%), protecting the company from total data loss during a regional outage.

2. The Security Layer: Why SAS Tokens?
Handing out the primary account key is like giving someone the master key to your entire building. If it's leaked, you have to change every lock.

The Strategy: We use Shared Access Signatures (SAS). This provides "least privilege" access—limiting a partner to Read-only permissions for a specific file, with an expiration timer.

The Business Value: It minimizes the "attack surface." Even if the link is intercepted, it becomes useless after 24 hours.

3. Cost Optimization: The Lifecycle Strategy
Storing 1TB of data in the "Hot" tier is significantly more expensive than the "Cool" or "Archive" tiers. Most company documents are rarely accessed after 30 days.

The Strategy: We implemented an Automated Lifecycle Policy to move blobs to the Cool tier after 30 days.

The Business Value: This is a FinOps win. It reduces storage costs by up to 60% without requiring a single manual action from the IT team.
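The savings are easy to sanity-check with back-of-the-envelope math. The per-GB prices below are assumptions for illustration only (real rates vary by region, redundancy, and over time, so check the Azure pricing page), and the calculation covers capacity alone, ignoring transaction and retrieval charges:

```python
# Assumed USD/GB/month prices, for illustration only.
HOT_PER_GB = 0.018
COOL_PER_GB = 0.010

def monthly_cost(gb: float, per_gb: float) -> float:
    """Capacity-only monthly storage cost."""
    return gb * per_gb

hot = monthly_cost(1024, HOT_PER_GB)    # 1 TB in the Hot tier
cool = monthly_cost(1024, COOL_PER_GB)  # the same 1 TB after tiering to Cool
savings = (hot - cool) / hot
print(f"Hot: ${hot:.2f}, Cool: ${cool:.2f}, savings: {savings:.0%}")
# → Hot: $18.43, Cool: $10.24, savings: 44%
```

At these assumed rates the capacity saving is roughly 44%; reaching the higher end of the 40-60% range depends on how much data also qualifies for the Archive tier.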

4. Object Replication: The "Fail-Safe"

Why replicate from a public website account to a private backup account?

The Strategy: This creates a fail-safe for your data. If the public account is compromised or accidentally deleted, Object Replication ensures an asynchronous, blob-level copy exists in a separate, private environment.

Architectural Summary

This implementation isn't just a collection of files; it’s a multi-layered data strategy designed for a modern enterprise environment. By the end of this configuration, we have achieved three core architectural goals:

1. Zero-Trust Data Access
By defaulting to Private Access and utilizing Shared Access Signatures (SAS), we eliminated the risk of "leaky buckets." We ensured that internal documents remain invisible to the public, while still providing granular, time-bound access to external partners without ever sharing root account keys.

2. Automated Disaster Recovery (DR)
The shift from Local Redundancy (LRS) to Geo-Redundant Storage (GRS) provides a fail-safe against regional outages. Combined with Object Replication from the public website to a private backup container, we’ve created a "dual-site" data presence that ensures business continuity even if an entire Azure region goes dark.

3. "Hands-Off" Cost Management
Through Lifecycle Management policies, the architecture is "self-cleaning." By automatically tiering aging data from Hot to Cool storage, we significantly reduced the long-term Total Cost of Ownership (TCO) without requiring manual intervention from the IT team.

The Result: A storage environment that is secure by default, resilient by design, and optimized for the bottom line.
