
Tuesday, 19 February 2013

Is hardware becoming throwaway to the extreme?

I've used Apple hardware since the mid-1990s and have seen numerous iterations, including three major processor changes. As a result, I've become quite accustomed to managing desktop Macs when they inevitably reach the end of their life.

I would also say that, in my experience, most people who regularly use a desktop machine for both work and leisure have a reasonably sound knowledge of how to repair, or at least diagnose, things when they go wrong. This has always been a helpful skill to have, in my opinion. Over the past few years, however, hardware has become increasingly closed, to the point of rendering DIY maintenance almost impossible.

This all seems to have started with laptop computers which, by their very nature, required smaller, more intricate designs. The laptop is perhaps the best example of a modern computer designed to the point that it hinders the kind of DIY skills I mention above. In particular, the screen is one of the most difficult parts to replace should it become damaged or fail. It's worth adding that access to motherboard-related components is just as difficult.

If we move on a few years, we see the first popular modern all-in-one computers arrive in the form of Apple's iMacs. While these were groundbreaking at the time in terms of design aesthetics, they brought with them the same closed-hardware problems that laptops had first introduced.

If we fast-forward to today, laptops, smartphones and all-in-one desktop PCs (Apple or otherwise) have massively reduced a user's ability to maintain their own hardware. Apple, unfortunately, seems to be the biggest culprit, with a high proportion of its product line now almost entirely closed, leaving expensive 'care' plans or approved-dealer repairs as the only real options left on the table.

Currently, the low-end iMac is one of the best examples of this problem.

As iFixit says:
To save space and eliminate the gap between the glass and the pixels, Apple opted to fuse the front glass and the LCD. This means that if you want to replace one, you'll have to replace both.
The cost is quickly apparent: cutting open the display destroys the foam adhesive securing it shut. Putting things back together will require peeling off and replacing all of the original adhesive, which will be a major pain for repairers.

Let's quickly look at the pros and cons from an end user's perspective.

Closed hardware

Pros

  • Can use the latest software
  • Can make use of the latest hardware features
  • Setup is probably easier than with older hardware
  • Increased chance of compatibility with new products/systems on the market

Cons

  • Increase in unwanted, discarded hardware
  • More expense for smaller users
  • More difficulty moving data from old to new systems
  • Cannot swap out old hardware components to save money
  • Greater risk of voiding the warranty
  • Increased chance of incompatibility with legacy products/systems

My concern is this: for a huge company with hundreds of employees, an IT department can swap hardware in and out while maintaining the integrity of the data that employees create and use within the business. That model works.

However, a home user or small business doesn't necessarily have that luxury. They may have only one computer, or a less-than-adequate data-integrity solution.

Ultimately, I worry that this will continue to raise the level of expense that a small IT budget has to bear, while doing nothing to stem a throwaway society that only really benefits big business.

Also, with the advent of cloud technology, are we moving back to the world of terminals and mainframes, with not only our maintainability taken away but our entire computer systems as well?
