
.NET will be the downfall of Microsoft Windows

by Steve Wiseman on April 5, 2007

in Windows



Normally I would not post about development issues here, but while taking a break from working on the latest version of our remote control software, I found myself daydreaming up a theory about .NET and its impact on Windows.

We are avoiding .NET like the plague, because we want our executables to be small, with few or no dependencies. Why? Fewer dependencies mean our customers have fewer problems, and our software will work on almost every version of Windows in use today (from Windows 95 all the way through Vista once this next version is finished).

Another reason is performance. For our purposes, .NET simply cannot grab the screen fast enough and send it back over the network.
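To give you an idea of what I mean, here is a minimal sketch (not our actual code) of the kind of tight screen-grab path you can write against the plain Win32 GDI API in C++. Nothing here is specific to our product; it is just the sort of hot path we want to keep native:

```cpp
// Minimal Win32 GDI screen grab: the sort of hot path we keep native.
// Sketch only; error handling omitted.
#include <windows.h>

HBITMAP GrabScreen()
{
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);

    HDC screen = GetDC(NULL);                 // device context for the whole screen
    HDC mem    = CreateCompatibleDC(screen);  // off-screen DC to copy into
    HBITMAP bmp = CreateCompatibleBitmap(screen, w, h);

    HGDIOBJ old = SelectObject(mem, bmp);
    BitBlt(mem, 0, 0, w, h, screen, 0, 0, SRCCOPY);  // the actual pixel copy
    SelectObject(mem, old);

    DeleteDC(mem);
    ReleaseDC(NULL, screen);
    return bmp;  // caller owns the bitmap (DeleteObject when done)
}
```

Call something like that in a loop, diff the frames, and push the changes over a socket. Every bit of overhead in that path counts, which is why we do not want a virtual machine sitting between us and the hardware.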

We also want to produce a 64-bit version. Currently we use Borland Delphi to develop all of our products, so fulfilling that goal requires a 64-bit compiler. On the current development road map, a native 64-bit Delphi compiler is not due until 2010. That is way too long for us to wait.

To solve this we are moving certain vital parts of the server side to C++, and in the transition we have come to a sobering conclusion: C++ is discouraged at every turn, both inside and outside of Microsoft.

You will never see anything official from Microsoft discouraging the use of C++ (you certainly find that outside!). But there are little subtleties: you Google for a specific API example, and the document is gone; the only information you can find is for .NET. This happens more every day, and I think eventually it will be very hard to find those C++ examples at all.

Let’s take a moment and summarize what is going on here. Native code is being thrown away in favor of managed, virtual-machine code. What does that mean for you? More slowness and churn. Why? Native code will *always* be faster than code that runs through a virtual machine.

Now you say… Wait a minute, Steve. Hold on one cotton-pickin’ minute. This native code argument is crap because processors are so fast now. We have gobs of memory. A little processor speed can be given up in favor of ease of use for developers. Right?

I say no. As time goes by, more and more developers will find it difficult, if not impossible, to develop native code. Only a select few software development firms will stay the native-code course. This means the majority of Windows applications will be managed-code, memory-eating, bloated, CPU-hogging beasts.

You say again… Steve! You haven’t addressed what I said. THIS WON’T MATTER, SINCE ALL OF OUR COMPUTERS ARE SO POWERFUL NOW.

That is true only if you have no point of reference. It all comes crashing down when the same program is written for Mac OS X and, on the exact same hardware, runs circles around the Windows version.

You see, Apple has an easy-to-use development framework (please debate that another day!) called Cocoa, and the Objective-C code behind it still compiles to native code. This means anything developed this way will *always* have an advantage over the .NET/Java style of execution. And Cocoa is the way Apple has told the public to develop applications for OS X.

This seems minor, but the effect is mathematical. Let’s say a .NET version takes 5 seconds longer to load. If you have 20 .NET processes loading at startup, and those loads are effectively serialized, Windows takes 20 × 5 = 100 seconds longer to finish than the equivalent processes under Mac OS X. From that point on, Windows will always be at a competitive speed and memory disadvantage compared to OS X, or even Linux.
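Don’t take my word for the load times; measure them. Here is a small C++ sketch that times how long a program takes to become responsive, using the Win32 high-resolution timers (notepad.exe is just a stand-in; substitute whatever application you want to compare):

```cpp
// Rough load-time measurement: launch a GUI app and wait until it is
// ready for input. A sketch, not a rigorous benchmark; numbers will vary.
#include <windows.h>
#include <stdio.h>

int main()
{
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    wchar_t cmd[] = L"notepad.exe";  // stand-in: put the app you want to time here

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);

    if (!CreateProcessW(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
        return 1;

    WaitForInputIdle(pi.hProcess, 30000);  // returns once the app can accept input
    QueryPerformanceCounter(&t1);

    printf("load time: %.2f seconds\n",
           (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart);

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```

Run it against a native application and a .NET application that do roughly the same job, and compare the numbers yourself.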

I hear someone in the back saying they just go get their coffee while waiting for their machine to start up, so they don’t care about that. But what about the churn? The maddening churn. The churn that makes you want to smash your computer into a thousand pieces. That dreaded clicking of the hard drive when you do any little thing (since all of your memory has been exhausted and the system has to swap like crazy).

Let’s say a .NET application takes 5 MB more per process than a native one (a very conservative estimate!). If a user has 20 of these running, that is 20 × 5 = 100 MB more memory to handle the exact same setup as under OS X or Linux. Even at 2 GB I find my machine churning quite often. With the same hardware and the same memory, it almost never happens under OS X (I would cite Linux too, but I don’t run it on any of my machines right now).
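Again, you can check the footprint yourself rather than trusting my 5 MB figure. Task Manager will do, or a few lines of C++ against the PSAPI library (link with psapi.lib):

```cpp
// Print the current process's working set. A sketch with minimal error
// handling; use OpenProcess on another PID to inspect a different program.
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

int main()
{
    PROCESS_MEMORY_COUNTERS pmc;
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
        printf("working set: %.1f MB\n",
               pmc.WorkingSetSize / (1024.0 * 1024.0));
    return 0;
}
```

Compare a native process against its .NET equivalent and watch the gap.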

This problem can be seen directly in Vista. Look at the Windows Photo Gallery program. I am almost positive the thing is written in .NET. It performs horribly on what I would call a very powerful machine (2 GB of RAM with a Core Duo processor). That machine happens to be a MacBook. A similar application on OS X runs circles around it. Why? Because that OS X application is written in native code.

People will start to notice this. I was never a fan of Apple or OS X, but 5-second boot times and very little churn have made me use it more and more. I find using Vista agonizing, as it sometimes ‘thinks’ about things forever.

Wait until almost all Windows apps are written in .NET… boy, will Windows scream then! Whoo hoo!

Don’t get me wrong, I am no Apple fanboy or anything like that; it is just that I see an insidious ticking time bomb in the .NET way of development. When Microsoft realizes the mistake it made, it will be too late. Not that it matters much for them, since they have so much cash stockpiled that it would take a decade or more of hemorrhaging money before it even became a problem.

It matters for you and me. We have to deal with the fact that our brand stinking new dual-core machine with 2 GB of RAM now runs slower than a machine we bought 10 years ago.

You want to know how I started using OS X? I noticed that when I was in a hurry, I could boot up and get into a web browser ten times faster with it. Eventually I started doing all of my work there. Even Windows development! Since OS X has less overhead, I can easily run what I need in a Parallels virtual machine and get my work done twice as fast. Still, I miss having two mouse buttons 🙁

I bet if I loaded Linux on this thing I would find exactly the same to be true. So you see, it has already started. Little ole me, a Microsoft fanboy, switched simply because the pain of Windows churn (.NET-induced, I theorize) was so great that it forced me to something else. That pain will cost Microsoft.

Microsoft didn’t have to choose this path. They could easily have allowed .NET to compile to native code, but they didn’t. Delphi is a perfect example of how to do exactly that. My insane theory about why they didn’t is laughed off by most: Microsoft feared the transition between 32-bit and 64-bit hardware.

They felt it could be an opportunity for a competitor to dethrone them. Making .NET a virtual-machine platform makes this transition very smooth. Why? Because Microsoft just needs to upgrade the underlying virtual machine, and the application is none the wiser.

A developer can move to the 64-bit platform in .NET with very few changes (if any at all). If most Windows applications were still native code, every vendor would have to release a new build for each 64-bit platform (at that time AMD’s 64-bit architecture was *not* compatible with Intel’s 64-bit Itanium). Without this easy transition it would take much longer for users to switch to 64-bit.
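To make the contrast concrete: a native binary bakes in its instruction set and pointer width at compile time, so each 64-bit platform needs its own build. A trivial C++ illustration, using the standard _WIN64 macro that the Microsoft compilers define:

```cpp
// A native binary fixes its pointer width and instruction set at compile
// time, so 32-bit and 64-bit targets need separate builds.
#include <stdio.h>

int main()
{
#if defined(_WIN64)
    printf("64-bit build: pointers are %u bytes\n", (unsigned)sizeof(void*));  // 8
#else
    printf("32-bit build: pointers are %u bytes\n", (unsigned)sizeof(void*));  // 4
#endif
    // A .NET assembly, by contrast, ships one IL binary, and the runtime
    // JIT-compiles it for whatever CPU it happens to land on.
    return 0;
}
```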

So they traded off speed in favor of continued dominance in the industry…only to doom themselves later on!

So that’s my theory and I am sticking to it. Only time will tell, but I bet we will still be talking about how long applications take to load five years from now. That problem won’t go away just because we have faster machines.

