You should have looked at a regular MacBook; the Pro model is more expensive. But even there - I have a cheap laptop. It's a Windows PC I only use when traveling. I can afford to lose it, and it does all I need a computer to do on the road.

Perhaps, but all I know is that a few weeks ago I priced out the MacBook Pro against very similar systems from other makers, and it still cost more than what was out there. So my price comment still stands.
If you compared processors, memory and other aspects and found a cheaper machine, more power to you.
Research the whole history, read some of Robert X. Cringely's discussions from the time, or see Triumph of the Nerds (1996).

I doubt it, considering Apple was still a relatively small company (125,000 Apple IIs sold by the time IBM started development on the 5150) compared to the other PC makers of the day; the TRS-80 alone sold 250,000 units in the same period.
That's why I pointed out IBM's deviation from its standard practice of the time, using all proprietary components, and how it cut its six-year design and development cycle to just about a year for its first PC.
You mention the IBM 5150, and as you Google around you will see a lot of 'changes' in the way the story is presented, but the upshot was that IBM came to the realization that the home/small-business market was not in its business model and finally understood that it was the future. At the time, IBM only made big iron with dumb terminals. Most of what you will get Googling these days is essentially old advertising, such as one piece IBM put out saying things like "...When the IBM Personal Computer (IBM 5150) was introduced to the world 25 years ago, it was dramatically clear to most observers that IBM had done something very new and different...." But look at the IBM 5150, and then look at the Apple II series from 1977 (four years before the 5150 was even brought onto the scene).
Compare the capabilities of the Apple II of 1977 vs. the IBM 5150 of 1981. There was a world of difference.
Apple's 'cardinal sin' was that it did have 'non-standard' interfaces, but going back to the 1970s there were few standards to begin with, outside of individual companies and those involved with the US Advanced Research Projects Agency (ARPA).
Woz essentially developed the Apple through interacting with people in the old Homebrew Computer Club. As Wikipedia says: "The 1999 made-for-television movie Pirates of Silicon Valley (and the book on which it is based, Fire in the Valley: The Making of the Personal Computer) describes the role the Homebrew Computer Club played in creating the first personal computers." I'm not a big fan of the movie, but it's OK. Apple was started to sell computers to 'everyman'. The point is, Steve Jobs was the marketer and Woz was the brain. Jobs realized the potential, and Apple was born. IBM was not doing anything new. Oh, sure, the advertising says they did, but that's advertising.
The only thing about 'standardization' was that IBM held such a big share of the business market that it wanted a machine which could interface with its existing equipment. They did not do it for the good of the computing world. Now, I'm not saying what they did was 'bad' or anything. That's business.
Where IBM screwed up was in using off-the-shelf parts to make its first machine. I can't remember the name of the consultant IBM called in, but it was he who convinced them that the only way to get a "PC" out in under a year was to do essentially what the Homebrew people did: buy off-the-shelf parts. Had IBM taken another year, it would have had the Windows PC market sewn up. Because of those off-the-shelf parts, Compaq quickly reverse engineered the IBM PC and, of course, made a licensing deal with Microsoft for the OS. From there on, clones popped up everywhere. Had IBM stayed with its proprietary approach, it would have had a captive market, and it is probable that, other than through licensing agreements, there would be no clones (such as Dell and all the others that came, not to mention went) today.
If you want to read a good book on machine development from around that time, read The Soul of a New Machine (which was actually drawn from experiences at Data General, a rival of Digital Equipment Corporation, for those of you who remember DEC). I visited the DEC 'home' in Maynard, Massachusetts back around 1980, and it was thrilling.

You can go back to the Z80 and the TRS-80, but that was, as with the Apple, the start of home computers that most people other than hobbyists could actually do anything with.
Now, you can look back and try to compare IBM to Apple in the 1977 to 1981 time frame, but it's a hard comparison to make. IBM was in just about every significant business in the world at the time; Jobs and Woz were working out of a garage. Going up against a goliath like IBM was quite a task. That Apple has done so well, considering the long-time disadvantage, is rather impressive, I'd say. And today they have reached a zenith where a Mac will run Windows, Linux, *BSD, etc. and their associated programs. Think about it: you decide to make a new type of soap. You may end up with a successful business, but what are the chances you will be able to compete in size with Procter & Gamble? It was similar with Apple. Two guys in a garage vs. IBM with all its R&D, its history, its market penetration into BIG businesses.
As to price, again, one should buy what one needs (or thinks one needs). This is why 'netbooks' are somewhat popular these days: typically small screens with slow processors, made for people who don't need a lot of power (which few people *really* need). For most people a computer is used for web browsing, email, and sometimes writing a letter or running a simple spreadsheet. Admittedly, computers are not only getting more powerful, they are also becoming more widespread, and people are starting to do things like home video (where the more 'power' the better) which they couldn't do before. I can only speak anecdotally, but I really know very few people with PCs (Macs or Windows) who do more than browse the web or use email (or Facebook and things like that). A netbook or cheap computer will do all they need, so there is a valid argument of "why buy more than you need".
Macs aren't for everyone. And neither are iPhones and iPods (I don't own either). But they have their place.
One other anecdotal point I will bring up: of everyone I have turned on to Macs, no one has ever called me with problems. As to Windows PCs, I cut even my own family off from 'fixing' their computers in the late 1990s when they messed up. It was always a nightmare: the software didn't support the sound card, or the video card, or something really stupid. Of course I was told Apple would soon be out of business and what a fool I was for buying Macs.
When my daughter was born in 1988, it only took about a year before I had to get her a computer (well, her and her mother) to keep them off my Mac. I bought her a Windows PC because her mother wanted one (so she could bring home Lotus files and work on stuff at home). Her mother also pointed out how much cheaper Windows PCs were compared to my Mac. It was a nightmare. First it was "Your Mac has sound built in. All we have is the cheap internal speaker." So it was off to buy a sound card. Then we found some programs would work with that specific sound card and some wouldn't. Then came more RAM. I think there was then a video card. In the end, that 'cheap Windows PC' cost more than my Mac. And getting everything to work together was, well, a trying experience, to say the least.
Yes, things have changed, but the same applies: buy what you think you will need. As much as I would *love* to buy a maxed-out Mac tower, I couldn't use all that power unless I got seriously into video. A maxed-out tower will set you back around US$10K.
