From Unix: A History and a Memoir by Brian Kernighan
As an example of how computing hardware has become cheaper and more powerful over the years, a 1978 PWB paper by Ted Dolotta and John Mashey described the development environment, which supported over a thousand users: "By most measures, it is the largest known Unix installation in the world." It ran on a network of seven PDP-11s with a total of 3.3 megabytes of primary memory and 2 gigabytes of disk. That's about one thousandth of a typical laptop today. Would your laptop support a population of a million users?
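A quick back-of-envelope check of that "one thousandth" figure. The PWB numbers (3.3 MB of memory, 2 GB of disk) come from the passage above; the "typical laptop" specs below are my own assumptions, and the true ratio obviously shifts with whatever machine you pick:

```python
# PWB installation figures, from the quoted passage
pwb_memory_mb = 3.3
pwb_disk_gb = 2

# Assumed specs for a "typical laptop" -- not from the source
laptop_memory_mb = 8 * 1024   # assume 8 GB of RAM
laptop_disk_gb = 512          # assume a 512 GB SSD

# Ratios of the 1978 network to the assumed laptop
memory_ratio = pwb_memory_mb / laptop_memory_mb   # roughly 1/2500
disk_ratio = pwb_disk_gb / laptop_disk_gb         # exactly 1/256

print(f"memory: about 1/{laptop_memory_mb / pwb_memory_mb:.0f}")
print(f"disk:   about 1/{laptop_disk_gb / pwb_disk_gb:.0f}")
```

Memory comes out closer to 1/2500 and disk closer to 1/250 under these assumptions, so "about one thousandth" is a fair round number sitting between the two.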
This reminds me of an observation, I believe by John Carmack on Twitter (though I can't dig up the link now), about how frustrating it is to write software for mobile phones today. They have orders of magnitude more raw power than the machines he programmed for in the Doom and Quake days, but back then he was much closer to the bare metal. All the intermediary layers eat up the performance gains in hardware and, paraphrasing, he found himself battling resource constraints as much as ever.
Don't get me wrong: modern languages with conveniences like garbage collection, or that run inside a bytecode VM like the JVM, have huge upsides. But raw performance is not one of them. We have made huge advances in raw computing power, but much of that benefit has accrued to developers instead of users, because developers no longer have to be as skilled or optimize their programs as aggressively. This means development is a bigger tent than it once was. But it does explain why computers don't necessarily feel any faster than, say, 10 years ago.