I hate Microsoft!!

*cough* buy a Mac *cough*

NEVER!... Get thee behind me...

Like I said, I'm naturally penurious; I can put together several (typically better-performing, utterly reliable) PCs for the price of one Apple Mac.

The early AMD X2s were little heaters, but a good heat-pipe heat sink would tame them (never got much of an overclock from them, though). My last one ended up in the HTPC, where it is very happy playing movies/music (W7 64 Ultimate).

I'm a contrarian; ...I like Microsoft operating systems (mostly)...

Enjoy!
 
Well, I guess you guys got lucky. My Vista experience was bad. Bad, bad, bad. I hated that damn thing. My computer was slow as hell, programs would crash far too often, it just had way too many bugs. And let's face it, MS knew it sucked. That's why they worked overtime to release Win7 so soon after Vista.

By the way, I think your experience was better because of the version you had. Lenny, were you using 64-bit, as well? I was running plain old vanilla Home Premium, but I'd heard from friends who were running the 64-bit Vista that it was a lot more stable. Though I'm not sure why that would be the case...


Are you sure you could do that in Vista? I might be wrong, but I seem to remember that one of the gripes about UAC was that you couldn't turn it off, which they fixed in Win7.

You probably installed it on a computer that couldn't handle it. I didn't have any problem with mine. I'm never getting a Mac because Macs don't support very many programming languages. The other operating systems aren't any more secure than Windows; they aren't even really a target, so claiming that you're more secure actually puts you at risk. Another thing is that Apple computers are so expensive. I can't justify spending over a thousand dollars for an entry-level machine. I know because I checked. I find that Mac and Linux lovers don't really know that much about security and what the bad guys are capable of.

The reason Windows Vista wasn't working well was that people were installing it on computers that couldn't handle it, not installing the Graphics Media Accelerator driver, or the device companies weren't getting their device drivers out quickly enough. It can also be a combination of those. It's not luck; it's installing Windows Vista properly. You have to check the recommended system requirements, not the minimum. I've never had a problem with it.

Why don't you go build your own operating system and find out how hard it is? That's what I say to the complainers who sit around griping about every little thing they don't like. I installed a game on a computer that met the minimum system requirements and it would crash every few minutes. It wasn't the game's problem; it was because the computer was barely fast enough to play it. I get tired of you all bashing perfectly legitimate companies. I tried installing Linux one time and the installer came up with an error. All of Microsoft's software has worked flawlessly for me. It can work that way for you, but you have to make sure your computer is capable of running it first. I saw a screenshot of Linux and it looks like an amateur did it. Mac looks much better.

You're listening to liars, Devil's Advocate. There is a way to turn off UAC, but it's there for a security reason. What it does is prevent anything that would affect other users from running without permission. If it's set up properly, only programs that don't affect other users run without an administrator prompt. See, that addressed a security issue. It makes me so upset when I find a product that has at least one advertised feature that doesn't work.
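For what it's worth, whether UAC is switched on is just a registry value (EnableLUA under the usual Policies\System key), so you can check it yourself rather than guess. A minimal, read-only sketch in Python, assuming a Vista/7 machine with Python on it; it only reads the value, it doesn't change anything:

# Minimal sketch: read the EnableLUA value that controls UAC (Windows only).
# Read-only - actually changing UAC still goes through the Control Panel.
import winreg

def uac_enabled():
    path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        value, _ = winreg.QueryValueEx(key, "EnableLUA")
    return bool(value)

if __name__ == "__main__":
    print("UAC is", "on" if uac_enabled() else "off")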
 
Let me first say I do not dislike Microsoft - people forget so quickly what things were like before they came up with what is effectively an industry standard OS. It may not be the best but the fact that nearly everyone uses it means an awful lot.

For those that don't/can't remember back that far: before the PC and DOS came along, every computer was different (typically even between different computers from the same manufacturer). Consequently, if you wrote an application it would only run on the one target computer system, and something like a word processor (if you could really call them that back then) cost on the order of £20k - I kid you not. Anyone want to go back to that? Not to mention that changing computers required a one-week course on the new system.

When the PC came out you could write a single application with a potential marketplace big enough to offer the software at sensible prices. With the advent of Windows everyone got used to a particular style of interface, to the extent that when you got a new piece of software you just started using it; no need to spend a week reading the operator's manual. Admittedly, with some of MS's new interfaces it now seems we are going back to square one again, which I hate. So it may not be the best, but MS gave us a standard that made all computer work easier from that time on.

Now my grouch. I have just gone to Windows 7 and I hate it with a vengeance. It may be more stable than Vista (I managed to skip Vista altogether), but it is far, far, far less stable than XP. With my old XP machine I almost never had lock-ups requiring the power switch to kill it (maybe once or twice in a year); I have had at least a dozen in the two and a half months I have had W7. Half the time when I launch a new program it does not load on top of other applications but behind them, and I have to bring it forward. I've had to switch to Live Mail, whose address search combo goes bonkers when it has drilled down to two options. I'm going to have to replace the Windows JPEG quick viewer thing: I do a lot of work with photos, and now each time I open a photo to look at it, it comes up in a new instance of the software (and MS say that can't be changed), so within a short while I discover I have about 50 open copies of the Windows Image Viewer. The list goes on; mostly small irritants, but irritants that weren't present in XP.

In fairness, I am a software developer and that does generally give OSes a hard time - testing your own software that contains bugs that mess up the OS. But it is all much worse than when I was on XP. Then again, I have moved from a laptop with a single processor and 2 GB of memory to a laptop with 8 processors and 8 GB of memory running 64-bit W7, and I can barely notice any improvement in general usage speed.
 
I don't know of a way to change that. You people keep blaming the wrong company. It's the fault of the driver companies for not hurrying and getting their drivers out; they took too long. Again, I had a game that kept crashing and I thought it was a problem with the game itself. It turned out I was running it on a computer that was only just fast enough; on my new computer it ran flawlessly. It's not about luck, it's about knowing how to do it. Vertigo, I believe you mean eight cores. Multicore processing doesn't make a difference unless multiple applications are running or the program has been multithreaded. Multithreading is where the program's code is split into more than one thread or process, so that each piece can run on its own core.
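To make that concrete (this is just my own toy example, nothing from the game or from Windows itself): here is a rough Python sketch of the same CPU-bound work done serially and then split across cores with a process pool. Unless a program is written to split its work like this, the extra cores sit idle, which is why eight cores barely speed up ordinary single-threaded use.

# Rough sketch: the same CPU-bound job run serially, then split across cores.
# count_primes and the work sizes are made up purely for illustration.
import time
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    # Deliberately naive, CPU-heavy work.
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    jobs = [50_000] * 8  # eight chunks of work

    start = time.time()
    serial = [count_primes(j) for j in jobs]           # one core does everything
    print("serial:  ", round(time.time() - start, 2), "s")

    start = time.time()
    with ProcessPoolExecutor() as pool:                # one worker per core by default
        parallel = list(pool.map(count_primes, jobs))  # chunks run on separate cores
    print("parallel:", round(time.time() - start, 2), "s")

    assert serial == parallel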
 
