
Managing S3-Compatible Storage Using a CLI Tool

Most S3-compatible storage providers, such as UpCloud and DigitalOcean, offer a dashboard for managing our storage. However, we often run into browser or web-related issues in certain situations, for example when uploading a large number of files. There are several CLI tools we can use instead to manage our storage, for tasks like uploading files, migrating files to another bucket, and so on.

One of the popular CLI tools is S3cmd. As an example, I use an object storage service provided by UpCloud. Over the past year, I have migrated many of my services from AWS and DigitalOcean to UpCloud because of its cost and performance. I have found that UpCloud actively develops new features and services and keeps improving its infrastructure performance.

To install S3cmd, we need Python and pip on our machine. After that, we can run the following command to install S3cmd.

pip install s3cmd
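
Alternatively, if we prefer not to use pip, S3cmd is also packaged for most Linux distributions. For example, on Debian or Ubuntu it can likely be installed from the system repositories (assuming the package is available in your release):

sudo apt install s3cmd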

Then, we can configure the tool by running the following command. Four fields are important in our case: the access key ID, the secret access key, the storage endpoint address, and the address template. In UpCloud, the endpoint address looks like your-storage-name.sg-sin1.upcloudobjects.com, while the address template can be %(bucket)s.your-storage-name.sg-sin1.upcloudobjects.com. The template lets us access any bucket simply with the syntax s3://your-bucket-name, without spelling out the complete address.

s3cmd --configure
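
The answers are saved to a configuration file, by default ~/.s3cfg. As a rough reference, the relevant fields might end up looking like the sketch below (the keys and endpoint are placeholders based on the UpCloud example above, not real credentials):

# ~/.s3cfg (excerpt)
access_key = YOUR_ACCESS_KEY_ID
secret_key = YOUR_SECRET_ACCESS_KEY
host_base = your-storage-name.sg-sin1.upcloudobjects.com
host_bucket = %(bucket)s.your-storage-name.sg-sin1.upcloudobjects.com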

After finishing the configuration, we can start using the tool. The following example shows how to upload several files stored in a local folder to a bucket.

s3cmd put --recursive /path/to/source-folder/ s3://your-bucket-name/target-folder/

Note the trailing slash on the source folder path: with it, only the contents of the source folder are copied into the target folder; without it, the folder itself, along with its contents, is copied into the target folder.
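
To make the difference concrete, here is a small sketch of the two variants (the paths are placeholders):

# with a trailing slash: the files inside source-folder land directly in target-folder/
s3cmd put --recursive /path/to/source-folder/ s3://your-bucket-name/target-folder/

# without a trailing slash: the folder itself is uploaded, ending up as target-folder/source-folder/
s3cmd put --recursive /path/to/source-folder s3://your-bucket-name/target-folder/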

We can also move files from one folder to another within our bucket.

s3cmd mv --recursive s3://your-bucket-name/source-folder/ s3://your-bucket-name/target-folder/
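
After the move, we can verify the result by listing the target folder; the ls command shows the objects under a given prefix (the bucket and folder names here are the same placeholders as above):

s3cmd ls s3://your-bucket-name/target-folder/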

The following command is used to download files from the bucket.

s3cmd get s3://your-bucket-name/images/source.jpg /path/to/images/target.jpg
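
The same --recursive flag also works in this direction if we want to pull down a whole folder rather than a single file; a sketch, using the same placeholder names as above:

s3cmd get --recursive s3://your-bucket-name/images/ /path/to/images/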

We can check the disk usage (i.e. the total size and number of objects) of a folder in our bucket with the following command.

s3cmd du s3://your-bucket-name/folder

There are many more commands that we can explore in the S3cmd manual.
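
For instance, to see the full list of commands and options directly from the terminal, we can ask the tool itself; this only prints usage information and does not touch the storage:

s3cmd --help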

