
Posts

Invisible Closure Scope in JavaScript

When we maintain variables in our JavaScript code, we already deal with scope, which determines the visibility of variables. There are three types of scope: block, function, and global. A variable defined inside a function is not visible (accessible) from outside the function, but it is visible to any blocks or functions inside that function. When a function is created, it has access to the variables in its parent scope as well as its own scope; this is known as closure scope. For example, if we create a function (child) inside another function (parent), at creation time the child function also gains access to the variables declared in its parent. Another way to think of closure is that every function in JavaScript has a hidden property called "Scope", which contains a reference to the environment where the function was created. The environment consists of the local variables, parameters, and arguments that were available to the...
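As a quick illustration of the idea above (a minimal sketch, not taken from the full post), a child function closing over a variable in its parent might look like this:

// parent() declares a local variable and returns a child function.
function parent() {
  let counter = 0; // lives in parent's scope

  // child() is created inside parent(), so its closure scope
  // keeps a reference to `counter` even after parent() returns.
  return function child() {
    counter += 1;
    return counter;
  };
}

const increment = parent();
console.log(increment()); // 1
console.log(increment()); // 2 -- `counter` survives between calls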

Threat Vectors in Cybersecurity

In cybersecurity, a threat is the potential occurrence of an undesirable event that can eventually damage or disrupt the operational and functional activities of a company or organization. Some examples are an attacker stealing sensitive data, infecting a system with malware, or tampering with data. In order to realize their intentions, threats need vectors. A threat vector is a medium through which an attacker gains access to a system by exploiting identified vulnerabilities. Some of the most common threat vectors used by adversaries are as follows. Direct/physical access: By having direct access to our computing devices, an attacker can perform many malicious activities like installing malicious programs, copying large amounts of data, modifying the device configuration, and so on. Protection: We should implement strict access control and restrictions. Removable media: Devices like USB flash drives, smartphones, or IoT devices may contain malicious programs...

Utilise GraphQL and Apollo Client for Maintaining React State

One popular library that allows our application to interact with a GraphQL server is Apollo Client (@apollo/client). It supports several popular client libraries, including React. It also provides cache management functionality to improve the performance of the application when interacting with GraphQL. Rather than integrating another library to manage our application state, we can leverage what Apollo Client already has to maintain the state. The solution is achieved by creating a customised query that loads its result from a local variable or storage. Then, the query is executed like any other query using useQuery(). The steps are as follows: create a local query that reads data only from the cache, create a cache that defines a procedure for the query to read data from local values, then call the query anytime we want to read the state value. For instance, suppose we want to read the information of the current user who has successfully logged in...
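As a rough sketch of those steps (the currentUserVar variable and the currentUser field name are made up here for illustration and are not taken from the full post), a local-only query backed by a reactive variable could look like this:

import { ApolloClient, InMemoryCache, gql, makeVar, useQuery } from '@apollo/client';

// A reactive variable holding the local state (the logged-in user).
const currentUserVar = makeVar(null);

// Step 2: a cache whose field policy tells Apollo how to read the local value.
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        currentUser: {
          read() {
            return currentUserVar();
          },
        },
      },
    },
  },
});

const client = new ApolloClient({ cache });

// Step 1: a local query marked with @client so it never hits the server.
const GET_CURRENT_USER = gql`
  query GetCurrentUser {
    currentUser @client
  }
`;

// Step 3: call the query like any other query whenever we need the state.
function Greeting() {
  const { data } = useQuery(GET_CURRENT_USER);
  return data?.currentUser ? `Hello, ${data.currentUser.name}` : 'Please log in';
}

// After a successful login, updating the reactive variable is enough:
// currentUserVar({ name: 'Luki' });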

Deploying Infrastructures Using Terraform on UpCloud

Terraform is a tool that helps us deploy infrastructure on any cloud provider such as AWS, GCP, DigitalOcean, and many more. Unlike Amazon CloudFormation, which is specific to AWS, Terraform supports the many cloud providers found in Terraform's registry. It uses a domain-specific language built specifically for provisioning and configuring infrastructure, named HCL or HashiCorp Configuration Language. Meanwhile, UpCloud is an alternative cloud provider for SMEs. It targets a segment quite similar to DigitalOcean and Linode. It provides a variety of popular cloud solutions such as managed Redis databases, S3-compatible storage, private networks, load balancers, and so on. Even though its cost is a little higher than DigitalOcean and others, it offers fairly complete features on each service, like the load balancer features that we will use in this post. Moreover, it actively publishes new features, like the managed OpenSearch database published rece...

The Truth About Reiner and Bertolt

"As a warrior, no road left but the one that leads to the end."

Communicate Through RabbitMQ Using NodeJS and Fastify

If we have two or more services that need to talk to each other but the communication is allowed to be asynchronous, we can implement a queue system using RabbitMQ. The RabbitMQ server maintains all the queues and the connections to every service connected to it. This post will utilize Fastify as a NodeJS framework to build our program. This framework is similar to Express but implements some unique features like a plugin concept and an improved request-response handler. First, we need to create two plugins: one for sending a message and another for consuming the sent message. To start, we will build it using a normal queue, which works like a queue in the real world. When there are five people in a queue and three staff members handling it, each person is served by only one staff member; no other staff member needs to handle a person who has already been served, and no person is handled repeatedly by different staff members. For example...
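A minimal sketch of the two plugins (assuming the amqplib and fastify-plugin packages, a local broker at amqp://localhost, and a queue named tasks, none of which are taken from the full post) could look like this:

const fp = require('fastify-plugin');
const amqp = require('amqplib');

// Producer plugin: decorates Fastify with a helper that pushes messages to the queue.
const producerPlugin = fp(async (fastify) => {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue('tasks', { durable: false });

  fastify.decorate('sendTask', (message) => {
    channel.sendToQueue('tasks', Buffer.from(JSON.stringify(message)));
  });

  fastify.addHook('onClose', async () => {
    await channel.close();
    await connection.close();
  });
});

// Consumer plugin: each queued message is delivered to a single consumer.
const consumerPlugin = fp(async (fastify) => {
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue('tasks', { durable: false });

  channel.consume('tasks', (msg) => {
    if (msg !== null) {
      fastify.log.info(`Received: ${msg.content.toString()}`);
      channel.ack(msg);
    }
  });
});

module.exports = { producerPlugin, consumerPlugin };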

Is Data Important?

Today, many systems generate huge amounts of data such as system logs, financial transactions, customer profiles, security incidents, and so on. This is encouraged by advances in technologies like IoT, mobile devices, and cloud computing. There are also fields dedicated to managing and processing large amounts of data, like data science and machine learning. A set of data can be processed to produce certain results, like detecting anomalies, predicting the future, or describing the state of a system. To generate such results, the typical phases are data collection, data preparation, visualization, and data analysis or result generation. In collecting data, we have to take several considerations into account, including the location where the data will be stored, the type of data stored, and the retrieval method, or how other systems can consume the data. When we want to select a location, we should consider whether the storage is available in the cloud or in on-premises infrastructure, ...

Manually Select Private Key for Git CLI

We may utilize different keys for different projects or accounts. When we pull data from a Git repository through an SSH connection, by default the Git tool follows the default SSH configuration for selecting the key, which is located in ~/.ssh/id_rsa. We can also set a custom SSH configuration in the ~/.ssh/config file that the Git tool will follow too, as explained in my other post. For setting the private key locally or per session, there are other options. First, we can utilize an environment variable that is read by the Git tool for selecting the correct SSH command, which is GIT_SSH_COMMAND. The usage is as follows.

GIT_SSH_COMMAND="ssh -i ~/.ssh/your_id_rsa -F /dev/null" git clone git@github.com:your/project.git

The -F /dev/null parameter is used for ignoring any available SSH configuration on the host. This method applies the custom SSH command during the user session, or it can be made permanent by setting it in the host envi...

Managing Passwords in Unix Using Pass

If you are looking for simple password management in Unix, pass may be the answer. It utilizes GPG to encrypt the stored passwords and stores them as text files in a tree of directories. Each directory can maintain a separate GPG key for encrypting the passwords stored inside it. How easy is it? The following command shows how we can store a password and set AWS/access-key-id as the name used to access it in the future.

pass insert AWS/access-key-id

The previous command will automatically create a directory named AWS inside the ~/.password-store directory, which is the default location of pass storage. It also creates a file named access-key-id.gpg inside the ~/.password-store/AWS directory. To access the value, we can call the following command.

pass AWS/access-key-id

There are some steps we need to run to start utilizing the tool: install pass using a package manager, create a GPG key pair record, and initialize the pass storage wit...

The First Time Kenshin Met Hiko

"Tell me your name" "Shinta" "Too soft for a swordsman, as of today you are Kenshin"

Essential Ansible Modules

Ansible is a reliable configuration management tool. It ships with a lot of modules, including those provided by the community. Some modules are essential and come in very handy in everyday tasks. Ansible is push-based and works by generating a Python script that is run on the target server. It means the target server is required to have Python, which is commonly shipped with most Linux distros.

package

The module is used to manage packages on the target host. It is like running apt, yum, or aptitude. The following snippet is an example of its usage to install the Nginx package using the package manager.

tasks:
  - name: Install Nginx
    package:
      name: nginx
      state: present
      update_cache: True

file

It is used to manage files, symlinks, links, or folders on the target host. These are two examples.

tasks:
  - name: Create a directory
    file:
      path: "/home/luki/mydir"
      state: directory
      mode: 0750
  - name: C...

Installing VSCode Server Manually on Ubuntu

I once got stuck updating the VSCode server on my remote server because of an unstable connection between the remote server and visualstudio.com, which hosts the updated server source code. The download and update process failed over and over, so I couldn't remotely access my remote files through VSCode. The solution is to download the server source code through a host with a stable connection; in my case, I downloaded it from a cloud VPS server. Then I transferred the downloaded source code as a compressed file to my remote server through SCP. Once the file was on my remote server, I extracted it and aligned the configuration. The more detailed steps are as follows. First, we should get the commit ID of our current VSCode application by clicking on the About option in the Help menu. The commit ID is a hexadecimal number like 92da9481c0904c6adfe372c12da3b7748d74bdcb. Then we can download the compressed server source code as a single file from the host. ...