
Advantages of Using Protocol Buffers

Protocol Buffers is a language-agnostic mechanism for sharing objects between machines that aims to reduce payload size. We are already familiar with JSON, which is used by most RESTful APIs to send objects to and receive objects from any kind of client. JSON is convenient and supported by many platforms, so why should we learn about Protocol Buffers?

Besides optimizing the payload encoding, Protocol Buffers, also called protobuf, introduces a schema definition that must be maintained by the machines that encode or decode the delivered objects. The two main processes for delivering objects are serialization and deserialization. Serialization is the process of transforming an object instance in an application into an optimized binary payload. Deserialization is the process of decoding that binary data back into the desired object.
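To make the idea of a schema concrete, here is a minimal sketch of a .proto file. The message name and fields are hypothetical, just for illustration:

```proto
syntax = "proto3";

// A hypothetical message describing a user profile.
// Every machine that serializes or deserializes this object
// keeps a copy of this schema.
message User {
  string name   = 1;  // field number 1 identifies the field on the wire
  int32  age    = 2;
  bool   active = 3;
}
```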

Let's take a look at the following table, which compares XML, JSON, and protobuf.


                  XML     JSON    protobuf
Readability       Normal  High    Low
Strictness        Low     Normal  High
Size Efficiency   Low     Normal  High

JSON is easy for humans to read and debug. But if our object is mainly intended to be processed by machines, readability is not the focus.

JSON only supports three basic value types: boolean, number, and string. Meanwhile, protobuf defines many more data types, so any platform or programming language can decode the object into the desired type automatically. Moreover, data-type strictness can be maintained across machines because protobuf requires all machines or applications to share the data or message schema, along with a set of rules and options.
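As a sketch of the richer types available beyond JSON's boolean, number, and string (all names here are hypothetical):

```proto
message Order {
  enum Status {
    PENDING = 0;
    PAID    = 1;
    SHIPPED = 2;
  }
  Status status              = 1;  // enumeration, not a free-form string
  int64  created_at          = 2;  // 64-bit integer, no precision loss
  repeated string tags       = 3;  // list of strings
  map<string, double> prices = 4;  // key-value pairs
  bytes  receipt_pdf         = 5;  // raw binary data
}
```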

Unlike XML or JSON, which carry the object schema explicitly in the payload and deliver the object as plain text, protobuf encodes the object into an optimized binary format with compact tags that describe the object structure, while the complete schema is maintained on every machine. This leads to a reduction in payload size.
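The size reduction can be seen by encoding a field by hand. The sketch below follows the canonical example from the protobuf encoding documentation, a message `Test1 { int32 a = 1; }` with `a = 150`: each field is prefixed with a compact key of `(field_number << 3) | wire_type`, and integers are packed as varints, 7 bits per byte.

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_int32_field(field_number: int, value: int) -> bytes:
    """Encode one int32 field: key varint, then value varint."""
    key = (field_number << 3) | 0  # wire type 0 = varint
    return encode_varint(key) + encode_varint(value)

payload = encode_int32_field(1, 150)
print(payload.hex())  # 089601
```

The whole message is just 3 bytes (`08 96 01`), versus 9 bytes for the equivalent JSON text `{"a":150}`, because the field name never travels on the wire; only the field number does.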

There is also a CLI tool called protoc that can help us encode and decode data. This tool can also generate, for various programming languages, the classes used to instantiate the objects defined in the protobuf schema.
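For example, generating Python classes from a hypothetical user.proto schema looks like this:

```shell
# Generate Python classes from user.proto in the current directory.
# --python_out tells protoc where to write the generated code.
protoc --python_out=. user.proto

# The generated user_pb2.py can then be imported in our application
# to build, serialize, and parse User messages.
```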

Another benefit of protobuf is that it standardises a mechanism for handling both backward and forward compatibility. Backward compatibility means a machine can still process an object delivered by another machine running an earlier schema version. Forward compatibility means it can handle an object delivered by a machine running a later version. This is achieved inherently through the concepts of default values and reserved fields built into protobuf.
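A sketch of how the two concepts appear in a schema (the message and field names are hypothetical):

```proto
message User {
  // Field 2 ("nickname") existed in an earlier version and was removed.
  // Reserving it prevents a future field from reusing the number or name,
  // which would corrupt old payloads.
  reserved 2;
  reserved "nickname";

  string name   = 1;
  // An old reader that doesn't know field 3 simply skips it (forward
  // compatibility); a new reader given an old payload that lacks it
  // decodes the default value, false (backward compatibility).
  bool   active = 3;
}
```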

As a reminder, protobuf doesn't handle the communication process itself; that is handled by a framework like gRPC. However, protobuf can define a service block in the schema to describe how a service receives a request message and sends a response message.
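A service block is only a contract description; the names below are hypothetical, and a framework such as gRPC provides the actual transport:

```proto
service UserService {
  rpc GetUser (GetUserRequest) returns (GetUserResponse);
}

message GetUserRequest  { int32 id   = 1; }
message GetUserResponse { string name = 1; }
```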

