
Posts

Showing posts from 2023

Kenshin's First Scar

Wondering who was the first person to leave a scar on Kenshin's face? The answer is unexpected.

Create Effective Documentation for Software Project

As your software project grows, it may involve more contributors. If you build a platform that publishes APIs for public consumption, you may expect more users on your platform. If you work on an internal project that involves many parties from several vendors, you may expect everyone to understand the project and collaborate well. In any scenario, effective documentation can help you achieve what you want. We should consider a user-oriented design for our documentation, one that considers who will use our product and what goal our users pursue by reading the documentation. Sometimes, it can even help us develop the project itself by making us see the project from a user's perspective. These are common types of audiences and the information they need. Evaluators, who examine whether the service or tool is useful; they need a high-level overview, a list of features, or expected benefits. New users, who are just learning the usage…

Terraform Cheat Sheet

Terraform has become more mature and can help us in many scenarios when provisioning infrastructure. These are a few scenarios that might be quite common in our day-to-day jobs. Take values from another state as a data source: this might be used when we already maintain a base state and a few child configurations need to access certain values from it. First, define the data source with attributes for accessing the other state: data "terraform_remote_state" "SOME_NAME" { backend = "local" config = { path = "/path/to/another/terraform.tfstate" } } Then, we can pass the value into any resources. For example, we forward a value into another output value: output "public_ip" { value = data.terraform_remote_state.SOME_NAME.outputs.public_ip } Redeploy a resource: this might be useful when we find an error in a resource that requires us to redeploy it. terraform apply -replace="…
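As a minimal sketch of the redeploy scenario, the -replace flag takes a resource address; the address below is hypothetical, so substitute the resource you actually want to recreate.

    # Force Terraform to destroy and recreate one resource on the next apply.
    # "aws_instance.web" is a hypothetical resource address.
    terraform apply -replace="aws_instance.web"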

Shape of My Heart

He deals the cards as a meditation And those he plays never suspect He doesn't play for the money he wins He doesn't play for respect He deals the cards to find the answer The sacred geometry of chance The hidden law of a probable outcome The numbers lead a dance I know that the spades are the swords of a soldier I know that the clubs are weapons of war I know that diamonds mean money for this art But that's not the shape of my heart He may play the jack of diamonds He may lay the queen of spades He may conceal a king in his hand While the memory of it fades I know that the spades are the swords of a soldier I know that the clubs are weapons of war I know that diamonds mean money for this art But that's not the shape of my heart That's not the shape The shape of my heart If I told her that I loved you You'd maybe think there's something wrong I'm not a man of too many faces The mask I wear is one But those who speak know nothing And find out to t…

Managing S3-Compatible Storage Using CLI Tool

Most S3-compatible storage providers like UpCloud and DigitalOcean provide a dashboard for managing our storage. However, we sometimes face browser or web-related issues in certain conditions, for example when we try to upload a large number of files. There are CLI tools out there that we can use for managing our storage, like uploading files, migrating files to another bucket, etc. One of the popular CLI tools is S3cmd. For instance, I use the object storage service provided by UpCloud. Over this past year, I migrated many of my services from AWS and DigitalOcean to UpCloud because of its cost and performance. I found that UpCloud actively develops new features and services and improves its infrastructure performance. To install S3cmd, we need to have Python and pip on our machine. After that, we can run the following command to install S3cmd: pip install s3cmd Then, we can configure the tool by running the following command. Four fields are important in our ca…
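As a quick sketch of the day-to-day workflow, after configuration the common subcommands look like this; the bucket name and paths are hypothetical.

    # Interactive configuration: prompts for access key, secret key, endpoint, etc.
    s3cmd --configure
    # Upload a single file ("my-bucket" is a hypothetical bucket name).
    s3cmd put backup.tar.gz s3://my-bucket/backups/
    # Mirror a local directory into the bucket, copying only changed files.
    s3cmd sync ./website/ s3://my-bucket/website/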

Invisible Closure Scope in Javascript

When we are maintaining variables in our Javascript code, we must already know about scope, which determines the visibility of variables. There are three types of scope: block, function, and global. A variable defined inside a function is not visible (accessible) from outside the function. But the variable is visible to any blocks or functions inside that function. When a function is created, it has access to variables in the parent scope as well as its own scope, and this is known as closure scope. For example, if we create a function (child) inside another function (parent), at creation time the child function will also have access to variables declared in its parent. Another way to think of closure is that every function in JavaScript has a hidden property called "Scope", which contains a reference to the environment where the function was created. The environment consists of the local variables, parameters, and arguments that were available to the…
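A minimal sketch of that parent/child relationship; the counter names are of course hypothetical.

    // The inner function keeps access to variables declared in its
    // parent even after the parent has returned.
    function makeCounter() {
      let count = 0;           // lives in makeCounter's function scope
      return function () {     // the child closes over `count`
        count += 1;
        return count;
      };
    }

    const counter = makeCounter();
    console.log(counter());    // 1
    console.log(counter());    // 2 -- state survives between calls
    console.log(typeof count); // "undefined": not visible out here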

Threat Vectors in Cybersecurity

In cybersecurity, a threat is the potential occurrence of an undesirable event that can eventually damage or disrupt the operational and functional activities of a company or organization. Some examples are an attacker stealing sensitive data, infecting a system with malware, and data tampering. In order to realize their intentions, threats need vectors. A threat vector is a medium through which an attacker gains access to a system by exploiting identified vulnerabilities. Some of the most common threat vectors used by adversaries are as follows. Direct/physical access: by having direct access to our computing devices, the attacker can perform many malicious activities like installing malicious programs, copying a large amount of data, modifying device configuration, and so on. Protection: we should implement strict access control and restrictions. Removable media: devices like USB flash drives, smartphones, or IoT devices may contain malicious programs…

Utilise GraphQL and Apollo Client for Maintaining React State

One library that is quite popular for allowing our application to interact with a GraphQL server is Apollo Client (@apollo/client). It has support for several popular client libraries, including React. It also provides cache management functionality to improve the performance of the application when interacting with GraphQL. Rather than integrating another library to manage our application state, we can leverage what Apollo Client already has to maintain the state. The solution is achieved by creating a customised query that loads its result from a local variable or storage. Then, the query is executed like other queries using useQuery(). The steps are as follows, with a sketch after this list. Create a local query that will read data only from the cache. Create a cache that defines a procedure for the query to read data from local values. Call the query anytime we want to read the state value. For instance, we want to read the information of the current user that has successfully logged in…
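A minimal sketch of those three steps using Apollo Client 3's reactive variables; the currentUser field name and the shape of the stored value are assumptions for illustration.

    import { InMemoryCache, makeVar, gql } from '@apollo/client';

    // Local value holding the logged-in user; null until login succeeds.
    export const currentUserVar = makeVar(null);

    // Tell the cache how to resolve the local-only field.
    export const cache = new InMemoryCache({
      typePolicies: {
        Query: {
          fields: {
            currentUser: {
              read() {
                return currentUserVar();
              },
            },
          },
        },
      },
    });

    // The @client directive marks the field as local, so no network
    // request is sent when this query runs.
    export const GET_CURRENT_USER = gql`
      query GetCurrentUser {
        currentUser @client
      }
    `;

In a component, const { data } = useQuery(GET_CURRENT_USER) then returns whatever currentUserVar holds, and calling currentUserVar({ ... }) after login re-renders every component subscribed to that query.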

Deploying Infrastructures Using Terraform on UpCloud

Terraform is a tool that helps us deploy infrastructure on many cloud providers such as AWS, GCP, DigitalOcean, and more. Unlike Amazon CloudFormation, which is specific to AWS, Terraform supports the many cloud providers found in Terraform's registry. It uses a domain-specific language built specifically for provisioning and configuring infrastructure, named HCL or HashiCorp Configuration Language. Meanwhile, UpCloud is an alternative cloud provider for SMEs. It targets quite a similar segment to DigitalOcean and Linode. It provides a variety of popular cloud solutions such as a managed Redis database, S3-compatible storage, private networks, load balancers, and so on. Even though its cost is a little higher than DigitalOcean and others, it provides quite complete features on each service, like the features of the load balancer that we will use in this post. Moreover, it actively publishes new features, like the managed OpenSearch database published recently.
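As a minimal provider-setup sketch, assuming the official UpCloudLtd/upcloud provider from the registry; attribute names are from memory, so double-check them against the provider documentation.

    terraform {
      required_providers {
        upcloud = {
          source = "UpCloudLtd/upcloud"
        }
      }
    }

    # Credentials can also be supplied through the UPCLOUD_USERNAME and
    # UPCLOUD_PASSWORD environment variables instead of variables.
    provider "upcloud" {
      username = var.upcloud_username
      password = var.upcloud_password
    }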

The Truth About Reiner and Bertolt

"As a warrior, no road left but the one that leads to the end."

Communicate Through RabbitMQ Using NodeJS and Fastify

If we have two or more services that need to talk to each other but the communication is allowed to be asynchronous, we can implement a queue system using RabbitMQ. The RabbitMQ server will maintain all queues and the connections to all services connected to it. This post will utilize Fastify as a NodeJS framework to build our program. This framework is similar to Express but implements some unique features like a plugin concept and an improved request-response handler. First, we need to create two plugins: one for sending a message, another for consuming the sent message. At first, we will make it using a normal queue. The mechanism of how a queue works is like a queue in the real world. When there are five people in a queue and three staff members handling it, each person is served by only one staff member; there is no need for other staff to handle a person who has already been served, and no need for a person to be handled repeatedly by different staff members. For example…
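The full post wraps this in Fastify plugins; below is a bare sketch of the underlying queue round-trip using amqplib, which such plugins typically wrap. The queue name and broker URL are placeholders.

    // npm install amqplib
    const amqp = require('amqplib');

    async function main() {
      const conn = await amqp.connect('amqp://localhost');
      const channel = await conn.createChannel();
      await channel.assertQueue('tasks', { durable: false });

      // Producer side: publish one message to the queue.
      channel.sendToQueue('tasks', Buffer.from('hello'));

      // Consumer side: each message is delivered to exactly one consumer,
      // like one person in a queue being served by one staff member.
      await channel.consume('tasks', (msg) => {
        console.log('received:', msg.content.toString());
        channel.ack(msg);
      });
    }

    main().catch(console.error);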

Is Data Important

Today, many systems may generate huge amounts of data such as system logs, financial transactions, customer profiles, security incidents, and so on. This is encouraged by the advancement of technologies like IoT, mobile devices, and cloud computing. There are also fields dedicated to managing and processing large amounts of data, like data science and machine learning. A set of data can be processed to produce certain results like detecting anomalies, predicting the future, or describing the state of a system. To generate such a result, the typical phases are collecting data, data preparation, visualization, and data analysis or generating results. In collecting data, we have to take several considerations into account, including the location where the data will be stored, the type of stored data, and the retrieval method or how other systems can consume the data. When we want to select a location, we should consider whether the storage is available in the cloud or on-premise infrastructure…

Manually Select Private Key for Git CLI

We may utilize different keys for different projects or accounts. When we pull data from a Git repository through an SSH connection, by default the Git tool follows the default SSH configuration for selecting the key, which is located in ~/.ssh/id_rsa. We can also set a custom SSH configuration in the ~/.ssh/config file, which the Git tool will follow too, as explained in my other post. For setting the private key locally or per session, there are other options. First, we can utilize an environment variable that is read by the Git tool for selecting the correct SSH command, which is GIT_SSH_COMMAND. The usage is as follows. GIT_SSH_COMMAND="ssh -i ~/.ssh/your_id_rsa -F /dev/null" git clone git@github.com:your/project.git The -F /dev/null parameter is used to ignore any available SSH configuration on the host. This method will apply the custom SSH command during the user session, or it can be made permanent by setting it in the host environment…
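For a permanent per-repository setting, Git also supports the core.sshCommand configuration; a small sketch, reusing the same hypothetical key path:

    # Stored in .git/config, so it applies to every Git command
    # run inside this repository only.
    git config core.sshCommand "ssh -i ~/.ssh/your_id_rsa -F /dev/null"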

Managing Password in Unix Using Pass

If you are looking for simple password management in Unix, pass may be the answer. It utilizes GPG to encrypt the stored passwords. It stores the encrypted passwords as text files in a tree of directories. Each directory can maintain a separate GPG key for encrypting the passwords stored inside it. How easy is it? The following command shows how we can store a password and set AWS/access-key-id as the variable name to access it in the future. pass insert AWS/access-key-id The previous command will automatically create a directory named AWS inside the ~/.password-store directory, which is the default location of pass storage. It also creates a file named access-key-id.gpg inside the ~/.password-store/AWS directory. To access the value we can call the following command. pass AWS/access-key-id There are some steps we need to run to utilize the tool, as sketched below. Install pass using a package manager. Create a GPG key pair. Initialize the pass storage with the spe…
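A sketch of those setup steps end to end; the package manager and the key email are hypothetical, substitute your own.

    # 1. Install pass (Debian/Ubuntu shown here).
    sudo apt install pass
    # 2. Create a GPG key pair; note the key ID or email it reports.
    gpg --full-generate-key
    # 3. Initialize the storage with that key (hypothetical ID).
    pass init "you@example.com"
    # 4. Store and read a password.
    pass insert AWS/access-key-id
    pass AWS/access-key-id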

The First Time Kenshin Met Hiko

"Tell me your name" "Shinta" "Too soft for a swordsman, as of today you are Kenshin"

Essential Ansible Modules

Ansible is a reliable configuration management tool. It ships with a lot of modules, including those provided by the community. Some modules are essential and come in very handy in everyday tasks. Ansible is push-based and works by generating a Python script that will be run on the target server. This means the target server is required to have Python, which is commonly shipped with any Linux distro. package The module is used to manage packages on the target host. It is like running apt, yum, or aptitude. The following snippet is an example of its usage to install the Nginx package using the package manager. tasks: - name: Install Nginx package: name: nginx state: present update_cache: True file It is used to manage files, symlinks, links, or folders on the target host. These are the two examples. tasks: - name: Create a directory file: path: "/home/luki/mydir" state: directory mode: 0750 - name: C…
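Another everyday module in the same spirit is service, for managing daemons on the target host; a small sketch (not necessarily one of the modules the full post covers):

    tasks:
      - name: Ensure Nginx is running and starts on boot
        service:
          name: nginx
          state: started
          enabled: true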

Installing VSCode Server Manually on Ubuntu

I once got stuck updating the VSCode server on my remote server because of an unstable connection between my remote server and visualstudio.com, which hosts the updated server source codes. The download and update process failed over and over, so I couldn't remotely access my remote files through VSCode. The solution is to download the server source codes through a host with a stable connection; in my case, I downloaded them from a cloud VPS server. Then I transferred the downloaded source codes as a compressed file to my remote server through SCP. Once the file was on my remote server, I extracted it and aligned the configuration. The more detailed steps are as follows. First, we should get the commit ID of our current VSCode application by clicking on the About option in the Help menu. The commit ID is a hexadecimal number like 92da9481c0904c6adfe372c12da3b7748d74bdcb. Then we can download the compressed server source codes as a single file from the host…
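A sketch of the download-and-extract steps; the URL pattern is the commonly documented one that the Remote-SSH extension itself uses, so treat it as an assumption, and the commit ID is the example value from the post.

    # On a host with a stable connection; COMMIT_ID from Help > About.
    COMMIT_ID=92da9481c0904c6adfe372c12da3b7748d74bdcb
    curl -fLo vscode-server.tar.gz \
      "https://update.code.visualstudio.com/commit:${COMMIT_ID}/server-linux-x64/stable"
    # Copy the archive to the remote server via SCP, then on that server:
    mkdir -p ~/.vscode-server/bin/${COMMIT_ID}
    tar -xzf vscode-server.tar.gz --strip-components=1 \
      -C ~/.vscode-server/bin/${COMMIT_ID}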

Creating Self-signed and CA Certificate using OpenSSL

A self-signed certificate is very useful when we are in a development or closed environment and require a secure communication channel between nodes in our system, for example implementing HTTPS for client-server communication. To make our self-signed certificate recognized by all nodes in the system, we should generate a CA certificate and distribute it to all nodes. This CA certificate is used to verify and determine the issuer of the self-signed certificate. It is like a stamp on a certificate that ensures the certificate is issued by the authority stated in the certificate itself. The OpenSSL CLI tool will be used for this purpose. The following steps can be run to generate valid self-signed and CA certificates, as sketched after this list. Generate a private CA key. Generate a public CA certificate. Generate a private key for the target server. Generate a CSR for the server. Generate a public server certificate and sign it with the CA certificate. Before we start the certificate generation…
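A sketch of those five steps as OpenSSL commands; file names, key sizes, validity periods, and subjects are hypothetical choices.

    # 1. Private CA key
    openssl genrsa -out ca.key 4096
    # 2. Public CA certificate (self-signed, hypothetical subject)
    openssl req -x509 -new -key ca.key -sha256 -days 3650 \
      -subj "/CN=My Dev CA" -out ca.crt
    # 3. Private key for the target server
    openssl genrsa -out server.key 2048
    # 4. CSR for the server
    openssl req -new -key server.key -subj "/CN=myserver.local" -out server.csr
    # 5. Server certificate signed by the CA
    openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
      -CAcreateserial -days 365 -sha256 -out server.crt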

Managing Kubernetes Cluster in Ubuntu

There are several tools and services on the market that we can use to deploy and manage Kubernetes clusters. Some tools allow us to self-manage the cluster without charge, some require us to subscribe for a license, and some are provided as Kubernetes as a Service, such as Amazon EKS and Red Hat OpenShift Dedicated. For development purposes or small-scale services, there is a free tool to manage Kubernetes clusters named minikube, which is available for Windows, Linux, and MacOS. To build the cluster, Minikube supports several virtualization technologies such as Hypervisor, KVM, Docker, and so on. In this writing, we will utilize Docker as the virtualization solution and run it on Ubuntu 22.04. In general, we will run through the following steps, sketched below. Install Minikube. Install Docker. Allow a non-root user to access Docker. Start Minikube. Enable the Ingress extension. Install Kubectl. Install Minikube First, we need to install Minikube's dependencies…
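A condensed sketch of the Minikube-specific steps on Ubuntu (Docker installation itself omitted); the install path is a conventional choice.

    # Install Minikube (Linux amd64 binary from the official release bucket).
    curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64
    sudo install minikube-linux-amd64 /usr/local/bin/minikube
    # Allow the current non-root user to access Docker, then log in again.
    sudo usermod -aG docker $USER
    # Start the cluster with the Docker driver and enable the Ingress addon.
    minikube start --driver=docker
    minikube addons enable ingress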

Detecting Main Module in ESM Package

An ESM package works in a different fashion from CJS. An ESM package imports required modules asynchronously and has no support for several functions related to system files and directories. ESM follows the Javascript core as implemented in the browser. There are no such entities as __dirname, __filename, require(), and so on. If we want to detect whether a module is the main module being run in CJS, we can compare the values of require.main and module. If they are the same, the module is being run as the main module. Meanwhile, for a module in an ESM package, we can use the following approach, sketched after these steps. Get the value of process.argv[1], which contains information about the main file being called by Node. Get the value of import.meta.url, which contains information about the location of the module being accessed. Transform the information to have a similar format, then compare them. For example, we have a module in an ESM package, named module1.js. import { r…
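A minimal sketch of that comparison for a hypothetical module1.js; pathToFileURL converts the process.argv[1] path into the same URL format as import.meta.url.

    // module1.js (ESM): detect whether this file is the entry point.
    import { pathToFileURL } from 'node:url';

    const isMain = import.meta.url === pathToFileURL(process.argv[1]).href;

    if (isMain) {
      console.log('running as the main module');
    }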

Captain Levi Talked To Armin

At this point, Armin and the others start questioning their actions while Levi tries to justify what is happening.

Upgrading Gitlab Using Linux Package

Gitlab actively updates its products, including the Community Edition (CE). The easiest way to install Gitlab CE on a Linux machine is by using the package manager. To add the repository to our system, we can run the following commands. sudo apt update curl -s https://packages.gitlab.com/install/repositories/gitlab/gitlab-ce/script.deb.sh | sudo bash Before we run the upgrade process, we should follow the recommended version order. If not, we may end up breaking our Gitlab service. Upgrading from one version to another far greater one may require upgrading through the previous versions first, step by step. To check our current Gitlab version, we can visit <OUR_GITLAB_ADDRESS>/help . The upgrade recommendation can be found HERE . Note that we are also advised to upgrade our Gitlab regularly for security reasons. We can get the list of available versions HERE . To apply the upgrade on our machine, we can run the following command. sudo apt-get install gitlab-ce=<VERSION_C…
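As a sketch of what a pinned upgrade looks like in practice; the version number below is hypothetical, so pick yours from the upgrade-path recommendation.

    # List the versions available in the added repository.
    apt-cache madison gitlab-ce
    # Install one specific version (hypothetical number shown).
    sudo apt-get install gitlab-ce=16.3.1-ce.0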

Prototypal Inheritance in Javascript

Some programming languages support object-oriented approaches natively by providing a feature to create an object instance based on a class definition. Meanwhile, in Javascript, we know there is class syntax, but it is just a kind of syntactic sugar that allows us to instantiate an object that can inherit some traits from another object or class definition in a fashion similar to other programming languages. In Javascript, a class is just a function with a prototype property that maintains traits that can be passed down to other object instances. There are several ways to allow property inheritance in Javascript. Functional In this way, we utilize the Object.create() method built into Javascript. The steps are: Create an object as the prototype provider. Pass the prototype provider to the Object.create() method, along with additional property declarations if needed. For example, const provider = { sum: (a, b) => a + b }; const obj = Object.create( provider…
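A sketch completing the Object.create() example from the excerpt; the extra label property is a hypothetical addition to show the second argument's property-descriptor syntax.

    const provider = { sum: (a, b) => a + b };

    // The second argument declares own properties via descriptors.
    const obj = Object.create(provider, {
      label: { value: 'calculator', enumerable: true },
    });

    console.log(obj.sum(1, 2));                           // 3, inherited from provider
    console.log(Object.getPrototypeOf(obj) === provider); // true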