SCC: Virtualization — Take It To The Next Level

Super Protocol
3 min read · Oct 20, 2022


This might be an unpopular opinion in Web3 discourse, but we have to admit that cloud services are not just a “necessary evil” to tolerate until decentralized networks finally win: they paved the way for the current web and all the services that millions of people use daily, including the ones we’re using to build the future web.

Today you can sign up for an AWS or Heroku account, upload your code, configure your “machine” by picking options from a dropdown menu, launch your app, and start serving your first users and customers. The magic and simplicity of this are definitely underappreciated. So let’s dive a bit deeper into what happens here.
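To make that signup-and-launch magic concrete, here is a minimal sketch of the same step done in code with boto3, AWS’s Python SDK. The region, AMI ID, and key pair name below are placeholders, not real values:

```python
# Minimal sketch: launching a virtual server on AWS with boto3.
# The region, AMI ID, and key pair below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",          # the "dropdown menu" choice, as code
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",            # placeholder SSH key pair
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")
```

A few lines, and somewhere a real machine (or a slice of one) starts serving you. Compare that with what the same step used to involve: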

Not so long ago, you had to be your own host: maintain a rack of hardware to act as your server, keep it accessible and running 24/7, and manage low-level software and updates. If there was a power outage, you’d be on your own. If you needed more RAM, CPU, or storage space, you’d have to actually buy and install new hardware and update the system so it could work with the new components. Building complex applications able to serve millions of users was almost impossible.

When cloud services such as AWS first appeared, they solved this problem by decoupling the task of maintaining a hardware server (plus the infrastructure software on top of it) from the task of running applications on that stack.

This allowed developers to (a) focus less on infrastructure problems and more on the actual product; and (b) lower the upfront costs of building a web service, removing the entry barrier. What happened next is that all kinds of people started building and hosting dynamic applications that could do all sorts of things, instead of the static websites of the previous era.

By the way, we’re living through something similar in Web3 right now. Several bold teams are building the infrastructure so that others can create and run decentralized applications without thinking about computational / storage resources, throughput, tokenomics, and security. Super Protocol included.

Back to the topic: how did the first cloud providers pull off this decoupling? They “virtualized” the server, so instead of the physical hardware you work with its image, which has the same properties but is much easier to maintain. The virtualization software layer lets you treat an infrastructure made of discrete hardware as though you had an infinite supply of resources that you can split in any way necessary.
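As a toy illustration of that “infinite pool” abstraction (all names and numbers here are invented), a scheduler inside the virtualization layer might carve VM requests out of a fleet of discrete hosts like this:

```python
# Toy illustration: a scheduler that carves virtual machines out of a
# fleet of discrete physical hosts. Names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_cpus: int
    free_ram_gb: int

def place_vm(hosts: list[Host], cpus: int, ram_gb: int) -> str:
    """First-fit placement: take any host with enough spare capacity."""
    for host in hosts:
        if host.free_cpus >= cpus and host.free_ram_gb >= ram_gb:
            host.free_cpus -= cpus
            host.free_ram_gb -= ram_gb
            return host.name
    raise RuntimeError("pool exhausted: no host can fit this VM")

fleet = [Host("rack1-a", 32, 128), Host("rack1-b", 16, 64)]
print(place_vm(fleet, cpus=8, ram_gb=32))   # -> rack1-a
print(place_vm(fleet, cpus=16, ram_gb=64))  # -> rack1-a again (24 CPUs free)
```

Real schedulers are vastly more sophisticated, but the abstraction is the same: the caller asks for resources, not for a specific machine.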

As mentioned above, Web3 is on its way to the same levels of security and convenience. You still have to manage your nodes, contracts, integrations, transactions, and data sources. The level up that could boost adoption would be a new level of abstraction that provides new tools for the next generation of builders. If a developer could just configure all of the decentralized infrastructures using a convenient interface — that would free up a sizable chunk of time they could dedicate to developing and promoting their products.
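Purely for illustration, such an interface might look like a declarative deployment spec. Every field name and helper below is hypothetical and invented for this sketch; it is not Super Protocol’s actual API:

```python
# Hypothetical sketch of a declarative Web3 deployment. All field names
# and the deploy() helper are invented for illustration only; this is
# NOT Super Protocol's actual API.
deployment = {
    "app": "my-dapp",
    "compute": {"cpus": 4, "ram_gb": 8},                  # requested resources
    "storage": {"provider": "decentralized", "size_gb": 50},
    "payment": {"token": "SUPER", "max_budget": 100},     # hypothetical budget
}

def deploy(spec: dict) -> str:
    """Stand-in for a one-call deploy; a real implementation would
    negotiate with providers, pin data, and submit transactions."""
    print(f"Deploying {spec['app']} with {spec['compute']['cpus']} CPUs...")
    return "deployment-id-placeholder"

deploy(deployment)
```

The point is not the specific fields but the shape of the experience: declare what you need, and let the infrastructure layer figure out where and how it runs.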

Moving technology forward almost always happens “on top” of something, be it previous research, a prototype, or infrastructure. In our case, Super Protocol takes the best practices of the Web2 cloud to provide convenience and security for the next giant leap.
