

Kenshin's First Scar

Wondering who was the first person to leave a scar on Kenshin's face? The answer is unexpected.

Create Effective Documentation for Software Project

As your software project grows, it may attract more contributors. If you build a platform that publishes public APIs, you can expect more users to adopt it. If you work on an internal project that involves many parties from several vendors, you want everyone to understand the project and collaborate well. In any of these scenarios, effective documentation helps you achieve what you want. We should take a user-oriented approach to our documentation, considering who will use our product and what goal our users pursue by reading it. Seeing the project from a user's perspective can sometimes even help us develop the project itself. These are common types of audiences and the information they need. Evaluators examine whether the service or tool is useful; they need a high-level overview, a list of features, or the expected benefits. New users are just learning the usage. Th

Terraform Cheat Sheet

Terraform has become more mature and can help us in many infrastructure-provisioning scenarios. These are a few scenarios that are quite common in day-to-day work. Take values from another state as a data source. This is useful when we already maintain a base state and a few child configurations need to access certain values from it. First, define the data source with the attributes for accessing the other state.

data "terraform_remote_state" "SOME_NAME" {
  backend = "local"
  config = {
    path = "/path/to/another/terraform.tfstate"
  }
}

Then, we can pass the value into any resource. For example, here we expose a value as another output value.

output "public_ip" {
  value = data.terraform_remote_state.SOME_NAME.outputs.public_ip
}

Redeploy a resource. This is useful when we find an error in a resource that requires us to redeploy it.

terraform apply -replace="

Shape of My Heart

He deals the cards as a meditation
And those he plays never suspect
He doesn't play for the money he wins
He doesn't play for respect
He deals the cards to find the answer
The sacred geometry of chance
The hidden law of a probable outcome
The numbers lead a dance

I know that the spades are the swords of a soldier
I know that the clubs are weapons of war
I know that diamonds mean money for this art
But that's not the shape of my heart

He may play the jack of diamonds
He may lay the queen of spades
He may conceal a king in his hand
While the memory of it fades

I know that the spades are the swords of a soldier
I know that the clubs are weapons of war
I know that diamonds mean money for this art
But that's not the shape of my heart
That's not the shape
The shape of my heart

If I told her that I loved you
You'd maybe think there's something wrong
I'm not a man of too many faces
The mask I wear is one
But those who speak know nothing
And find out to t

Managing S3-Compatible Storage Using CLI Tool

Most S3-compatible storage providers, like UpCloud and DigitalOcean, provide a dashboard for managing our storage. However, in certain conditions we may face browser or web-related issues, for example when uploading a large number of files. There are CLI tools we can use instead for managing our storage, such as uploading files or migrating files to another bucket. One of the popular CLI tools is S3cmd. For instance, I use the object storage service provided by UpCloud. Over the past year, I migrated many of my services from AWS and DigitalOcean to UpCloud because of its cost and performance. I found that UpCloud actively develops new features and services and keeps improving its infrastructure performance. To install S3cmd, we need Python and pip on our machine. After that, we can run the following command to install S3cmd.

pip install s3cmd

Then, we can configure the tool by running the following command. Four fields are important in our ca

Invisible Closure Scope in Javascript

When we maintain variables in our JavaScript code, we must already know about scope, which determines the visibility of variables. There are three types of scope: block, function, and global. A variable defined inside a function is not visible (accessible) from outside the function, but it is visible to any blocks or functions nested inside that function. When a function is created, it has access to variables in the parent scope as well as its own scope; this is known as closure scope. For example, if we create a function (child) inside another function (parent), at creation time the child function also gains access to the variables declared in its parent. Another way to think of closure is that every function in JavaScript has a hidden property called "Scope", which contains a reference to the environment where the function was created. The environment consists of the local variables, parameters, and arguments that were available to the
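A minimal sketch of this behaviour (the function and variable names are illustrative): the child function keeps access to its parent's local variable even after the parent has returned, while that variable stays invisible from the outside.

```javascript
// parent() declares a local variable; the returned child() keeps
// access to it through its closure scope even after parent() returns.
function parent() {
  let counter = 0; // local to parent, invisible from outside

  function child() {
    counter += 1; // child can still read and write parent's variable
    return counter;
  }

  return child;
}

const increment = parent();
console.log(increment()); // 1
console.log(increment()); // 2 — the closure preserves state between calls
console.log(typeof counter); // "undefined" — not visible in the outer scope
```

Each call to parent() creates a fresh environment, so two results of parent() count independently.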

Threat Vectors in Cybersecurity

In cybersecurity, a threat is the potential occurrence of an undesirable event that can damage or disrupt the operational and functional activities of a company or organization. Some examples are an attacker stealing sensitive data, infecting a system with malware, and tampering with data. To realize their intentions, threats need vectors. A threat vector is a medium through which an attacker gains access to a system by exploiting identified vulnerabilities. Some of the most common threat vectors used by adversaries are as follows. Direct/physical access : By having direct access to our computing devices, an attacker can perform many malicious activities, like installing malicious programs, copying large amounts of data, modifying the device configuration, and so on. Protection : We should implement strict access control and restrictions. Removable media : Devices like USB flash drives, smartphones, or IoT devices may contain malicious programs

Utilise GraphQL and Apollo Client for Maintaining React State

One library that is quite popular for allowing our application to interact with a GraphQL server is Apollo Client ( @apollo/client ). It supports several popular client libraries, including React. It also provides cache management functionality to improve the application's performance when interacting with GraphQL. Rather than integrating another library to manage our application state, we can leverage what Apollo Client already has to maintain the state. The solution is achieved by creating a customised query that loads its result from a local variable or storage. Then, the query is executed like other queries using useQuery() . The steps are as follows. Create a local query that will read data only from the cache. Create a cache that defines a procedure for the query to read data from local values. Call the query anytime we want to read the state value. For instance, we want to read the information of the current user that has successfully logged in
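The three steps above can be sketched as a cache configuration, assuming Apollo Client 3's reactive variables and field policies. The currentUserVar name and the currentUser field are illustrative, not from the post.

```javascript
import {
  ApolloClient,
  InMemoryCache,
  gql,
  makeVar,
  useQuery,
} from "@apollo/client";

// Local value holder (a reactive variable) for the logged-in user.
export const currentUserVar = makeVar(null);

// Step 2: a cache whose field policy tells the `currentUser` field
// to read its data from the local reactive variable.
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        currentUser: {
          read() {
            return currentUserVar();
          },
        },
      },
    },
  },
});

export const client = new ApolloClient({ cache });

// Step 1: a local-only query; the @client directive marks the field
// as resolved from the cache, never sent to the GraphQL server.
export const GET_CURRENT_USER = gql`
  query GetCurrentUser {
    currentUser @client
  }
`;

// Step 3: read the state anywhere, like any other query.
function CurrentUserName() {
  const { data } = useQuery(GET_CURRENT_USER);
  return data?.currentUser?.name ?? "anonymous";
}
```

Updating the state is then a plain call like currentUserVar({ name: "..." }), and components using the query re-render automatically. This is a configuration sketch rather than a runnable app, since it needs the @apollo/client package and a React renderer around it.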

Deploying Infrastructures Using Terraform on UpCloud

Terraform is a tool that helps us deploy infrastructure on almost any cloud provider, such as AWS, GCP, DigitalOcean, and many more. Unlike Amazon CloudFormation, which is specific to AWS, Terraform supports the many cloud providers found in Terraform's registry. It uses a domain-specific language named HCL (HashiCorp Configuration Language), designed specifically for provisioning and configuring infrastructure. Meanwhile, UpCloud is an alternative cloud provider for SMEs. It targets a segment quite similar to DigitalOcean and Linode. It provides a variety of popular cloud solutions such as a managed Redis database, S3-compatible storage, private networks, load balancers, and so on. Even though its cost is a little higher than DigitalOcean and others, it provides quite complete features on each service, like the load balancer features that we will use in this post. Moreover, it actively publishes new features, like the managed OpenSearch database published rece

The Truth About Reiner and Bertolt

"As a warrior, no road left but the one that leads to the end."

Communicate Through RabbitMQ Using NodeJS and Fastify

If we have two or more services that need to talk to each other and the communication is allowed to be asynchronous, we can implement a queue system using RabbitMQ. The RabbitMQ server maintains all queues and the connections to all services connected to it. This post utilizes Fastify as a NodeJS framework to build our program. This framework is similar to Express but implements some unique features, like a plugin concept and an improved request-response handler. First, we need to create two plugins: one for sending a message, another for consuming the sent message. At first, we will build it using a normal queue. One way a queue works is like a queue in the real world: when there are five persons in a queue and three staff to handle it, each person is served by exactly one staff member, no other staff member needs to handle a person who has already been served, and no person is handled repeatedly by different staff members. For example
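The real-world analogy above can be sketched in plain JavaScript (no RabbitMQ involved; the names are illustrative): a round-robin dispatcher that hands each queued item to exactly one worker, which mirrors how RabbitMQ distributes messages from a normal queue across its consumers by default.

```javascript
// Five queued "persons" dispatched round-robin to three "staff" workers.
// Each item is delivered to exactly one worker and never duplicated.
function dispatch(queue, workerCount) {
  const assignments = new Map(); // worker index -> items served by it
  queue.forEach((item, i) => {
    const worker = i % workerCount; // round-robin selection
    if (!assignments.has(worker)) assignments.set(worker, []);
    assignments.get(worker).push(item);
  });
  return assignments;
}

const served = dispatch(["p1", "p2", "p3", "p4", "p5"], 3);
console.log(served.get(0)); // ["p1", "p4"]
console.log(served.get(1)); // ["p2", "p5"]
console.log(served.get(2)); // ["p3"]
```

The key property is that the union of all workers' lists is exactly the original queue: nothing is lost and nothing is served twice.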

Is Data Important

Today, many systems generate huge amounts of data, such as system logs, financial transactions, customer profiles, security incidents, and so on. This is driven by advances in technologies like IoT, mobile devices, and cloud computing. There are also fields dedicated to managing and processing large amounts of data, like data science and machine learning. A set of data can be processed to produce certain results, like detecting anomalies, predicting the future, or describing the state of a system. To produce such a result, the typical phases are collecting data, data preparation, visualization, and data analysis or generating results. In collecting data, we have to consider several things, including the location where the data will be stored, the type of stored data, and the retrieval method, or how other systems can consume the data. When we want to select a location, we should consider whether the storage is available in the cloud or on-premise infrastructure,
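As a toy illustration of the "detecting anomalies" result mentioned above (the threshold rule and the sample data are made up for the example), one simple approach flags values that sit more than a couple of standard deviations from the mean:

```javascript
// Flag data points further than `threshold` standard deviations from the mean.
function detectAnomalies(values, threshold = 2) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance =
    values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / values.length;
  const stdDev = Math.sqrt(variance);
  return values.filter((v) => Math.abs(v - mean) > threshold * stdDev);
}

// e.g. response times in milliseconds with one obvious outlier
console.log(detectAnomalies([12, 11, 13, 12, 11, 12, 95])); // [95]
```

Real pipelines use far more robust techniques, but even this tiny rule shows why the collection and preparation phases matter: the result is only as good as the data fed into it.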