Laverage
Jan 08 2022 at 23:30 GMT
I am using Google Cloud Storage to store user-generated content in my app.
However, when developing my app locally, I do not want to use my production Google Cloud Storage bucket.
Is there any way to mock Google Cloud Storage locally, so that uploaded files are stored on my file system and can be easily accessed on a localhost URL?
Mike The Programmer
Jan 09 2022 at 13:25 GMT
You can mock Google Cloud Storage with Docker by using the fsouza/fake-gcs-server Docker image.
Here's how you can run it locally using docker run:
docker run -d -p 5050:4443 -v storage_data:/storage fsouza/fake-gcs-server -scheme http
Or, if you prefer docker-compose:
version: '3.9'
services:
  storage:
    image: fsouza/fake-gcs-server
    ports:
      - '5050:4443'
    command: -scheme http
    volumes:
      - storage_data:/storage
volumes:
  storage_data:
Note that I specified -scheme http so that the service can be accessed over http instead of https, to avoid dealing with TLS certificate issues. Additionally, to persist uploaded files across container restarts, I mapped the storage_data volume to the container's /storage directory.
Once you have the service running locally, it will be accessible on localhost on the specified port; in this example, 5050.
You can instruct the Google Cloud Storage client to use the http://localhost:5050 API endpoint when in development. The exact code to configure the client depends on the programming language you're using. For example, if you're using the @google-cloud/storage client for Node.js, you would do the following in development:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ apiEndpoint: 'http://localhost:5050' });
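For example, to verify that uploads actually go to the emulator, you could upload a local file to it (the bucket name example.com and the file path here are just placeholders):
const { Storage } = require('@google-cloud/storage');

// Point the client at the local fake-gcs-server instance.
const storage = new Storage({ apiEndpoint: 'http://localhost:5050' });

async function uploadExample() {
  // Upload ./image.jpg to the example.com bucket under the name image.jpg.
  await storage.bucket('example.com').upload('./image.jpg', {
    destination: 'image.jpg',
  });
  console.log('Uploaded image.jpg to the local bucket');
}

uploadExample().catch(console.error);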
It might be handy to have a getStorage function that returns a client configured for the current environment:
// `inDevelopment` stands for whatever flag you use to detect a dev environment,
// e.g. process.env.NODE_ENV !== 'production'
function getStorage() {
  if (inDevelopment) {
    return new Storage({ apiEndpoint: 'http://localhost:5050' });
  } else {
    return new Storage();
  }
}
In order to have a bucket in the mock Google Cloud Storage, you need to create it manually. Assuming the bucket name is stored in the GOOGLE_CLOUD_STORAGE_BUCKET_NAME environment variable, you can create it if it doesn't exist already using the following initBucket function:
async function initBucket() {
  if (inDevelopment) {
    const bucketName = process.env.GOOGLE_CLOUD_STORAGE_BUCKET_NAME;
    const storage = getStorage();
    const [buckets] = await storage.getBuckets();
    const foundBucket = buckets.find(({ name }) => name === bucketName);
    if (foundBucket) {
      console.log(`Bucket "${bucketName}" found!`);
      return;
    }
    await storage.createBucket(bucketName);
    console.log(`Bucket "${bucketName}" created!`);
  }
}
You can call this function from a script that runs before your dev server starts, so the bucket is guaranteed to exist by the time the server is up.
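As a rough sketch (the file name, require path, and npm script below are just one way to wire it up, not part of the original setup), such a script could look like this:
// scripts/init-storage.js (hypothetical file name)
// Reuses the initBucket function from above; adjust the require path to wherever it lives.
const { initBucket } = require('../src/storage');

initBucket()
  .then(() => console.log('Storage bucket ready'))
  .catch((err) => {
    console.error('Failed to initialize storage bucket:', err);
    process.exit(1);
  });
You could then run it automatically before your dev server with an npm pre-script, e.g. "predev": "node scripts/init-storage.js" in package.json.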
Let's say you have a bucket named example.com and you upload a file named image.jpg. You can then access it using the following URL:
http://localhost:5050/storage/v1/b/example.com/o/image.jpg?alt=media
Notice the ?alt=media at the end. Without it, you would get a JSON payload with information about the file instead of the file itself.
You can also access a JSON listing of all the uploaded files using the following URL:
http://localhost:5050/storage/v1/b/example.com/o
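If you want to check that listing from code rather than the browser, here's a quick sketch using Node 18+'s built-in fetch; it assumes the example.com bucket and the standard GCS JSON API listing shape with an items array:
// List the objects stored in the local fake GCS bucket and print their names.
async function listLocalObjects() {
  const res = await fetch('http://localhost:5050/storage/v1/b/example.com/o');
  const { items = [] } = await res.json();
  for (const item of items) {
    console.log(item.name);
  }
}

listLocalObjects().catch(console.error);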
Since the file URLs differ between development and production, it might be handy to have a getFileUrl helper function:
const bucketName = process.env.GOOGLE_CLOUD_STORAGE_BUCKET_NAME;

function getFileUrl(fileName) {
  if (inDevelopment) {
    return `http://localhost:5050/storage/v1/b/${bucketName}/o/${fileName}?alt=media`;
  } else {
    return `https://my-production-storage.example.com/${fileName}`;
  }
}
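Putting it together, a hypothetical helper (the function name and arguments are placeholders, not part of the original code) could upload a file and return the URL appropriate for the current environment:
// Upload a local file to the configured bucket and return a URL to serve it from.
async function saveUserFile(localPath, fileName) {
  const storage = getStorage();
  await storage.bucket(bucketName).upload(localPath, { destination: fileName });
  return getFileUrl(fileName);
}

// Usage:
// saveUserFile('./uploads/avatar.jpg', 'avatar.jpg')
//   .then((url) => console.log('Serve the file from:', url));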