Is cloud computing the future of the computer?
Strictly speaking, "cloud computing" does not exist as a single technology; it's an umbrella term, a bit like "cyber security".
What does exist are some commonly used platforms such as AWS, Azure, GCP, and so on. They make it possible to use compute and data through commands and APIs without any sense of using specific hardware.
Virtualization, which started in the late 1960s, made it possible to divide physical servers into logical virtual servers. A virtual server could be moved from one piece of hardware to another, or copied. Still, it was cumbersome and slow.
Around 2013, containers were popularized through software called "Docker". Containers are much lighter than virtual servers and make it possible to package software so that it can easily be moved between machines.
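To make that concrete: a container image is described in a small text file. This is a minimal, purely illustrative Dockerfile (the file name `app.py` and base image are just assumptions, not from any specific project):

```dockerfile
# Illustrative Dockerfile: the resulting image runs unchanged
# on a laptop, an on-premise server, or a cloud VM.
FROM python:3.12-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]
```

You build it once with `docker build -t myapp .`, and then `docker run myapp` starts the same packaged software on any machine that has Docker installed.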
It's part of a bigger trend that's called "software defined".
Nowadays it's possible to remove the whole notion of servers through serverless computing. Like cloud computing, it doesn't literally exist; it's a practice of running software without having to manage any servers. Some centralized provider does that for you. The good part is that you can run your code at huge scale while only paying for what you use.
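In practice, "serverless" code is usually just a small handler function that the provider invokes per request. A minimal sketch in the style of an AWS Lambda handler (the event fields here are made up for illustration):

```python
import json

def handler(event, context):
    """Serverless-style handler: receives an event dict, does some
    work, and returns a response. The provider decides where and on
    how many machines this runs, and bills per invocation."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally you can simply call the function; in the cloud, the
# platform calls it for you on every incoming request.
if __name__ == "__main__":
    print(handler({"name": "cloud"}, None))
```

The appeal is that nothing outside this function needs to be managed by you: no operating system, no process supervisor, no capacity planning.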
But even serverless does not remove all the problems of building scalable, reliable software, so in practice a combination of all these things is used.
Now, for your question: is cloud computing the future? That depends on a lot of things. Cloud computing often means that you are outsourcing work to trusted external providers who run data centers around the world. It often means vendor lock-in.
Which computing model will be dominant in the future is uncertain. It will certainly have some aspects of cloud computing.
Personally I believe in AI computing. You just declare to some assistant what you want, and it is coded and deployed on the spot without you seeing how or where it runs, relying on best patterns optimized by machine learning, using some intelligent database system built in a software language we don't really understand anymore. That will take at least a couple of decades, but it will happen if we don't get some meltdown first.
In some respects it's a good thing, in others not so much.
"Cloud" computing is actually nothing new. In fact it's a reversion to how things were done in the 1950s and 60s: the user sat in one city with a dumb terminal (at the time just a keyboard, printer and modem), sending instructions over a telephone line to a computer somewhere else, perhaps even on the other side of the continent. The computer would perform the actions and then send back the results so the printer could print them out for the user.
All that's changed is that newer user interfaces have been incorporated since then. E.g. instead of the printer, results now appear on a screen, and instead of a telephone line, the send/receive happens across a network, usually the internet. Not incredibly "new", just making use of other improvements as well. There's still one or more servers to which the user communicates requests for actions, and those same results are still sent back to the user's terminal (be that a phone, laptop, or whatever).
One aspect of "cloud" computing is that a supplier can now charge per use much more easily than by selling a program to someone, i.e. it can be much more profitable. It also means they only need to support a single machine instead of making sure the software runs on every possible user computer. This is the major reason it's pushed so much in marketing: more profit and cheaper support.
Then again, it means the user doesn't need to lug around a machine capable of performing the work. Though this is ever less of a concern, as machines with similar capabilities are already much smaller than a few years ago; most phones these days are thousands of times more capable than a mainframe of the 70s, or even a workstation of the 90s. So whether this is truly a benefit depends on the use case. The user still needs something to interact with that remote server, and likely the thing they use for this could just as easily perform the actions itself, with no need to remote-control a server somewhere.
Other pros of cloud include things like: always up to date, easier to ensure compatibility with the hardware (since there's just the one hardware set), a central point of support, quicker and easier upgrades, no more need to support various versions, easier collaboration between several users on the same data, etc.
Though there are also negatives, e.g.:

- You need to be online for any form of use (being offline effectively negates the cloud).
- It will always be slower due to the latency and speed limits of the network.
- Availability depends on the connection.
- You rely on a third party for data integrity, program upkeep, security, availability, etc.
- It usually costs more over the long term, since it tends to be charged per use or per time period instead of as a one-off purchase.
I don't think it will supplant normal local computing, though it will always be some part of the future. Some tasks don't lend themselves well to a cloud-based approach, while others make more sense on a "cloud". The best that could happen is some hybrid between the two ... wait ... that's what we have already!
Actually I think the easier split for your question is to say:
- Is cloud computing the future of the personal computer?
- Is cloud computing the future of server based computing?
The answer to 2 is yes: cloud represents the future of data centers and server-based computing.
For number 1 the initial answer is no. But note that with the growing VDI (virtual desktop infrastructure) reality, we now have more capability to leverage cloud computing to replace desktops and laptops with tablets and cell phones. So the easy answer to 1 is no; the longer answer is maybe.