
The IT dark age – The relapse

I long ago used a slide in my talks about the IT dark age, showing how we’d come through a period (early 90s) where engineers were in charge and it worked, into an era where accountants had got hold of it and were misusing it (mid 90s), followed by a terrible period where administrators discovered it and used it in the worst ways possible (late 90s, early 00s). After that dark age, we started to emerge into an age of IT enlightenment, where the dumbest of behaviours had hopefully been filtered out and we were starting to use it correctly and reap the benefits.

Well, we’ve gone into relapse. We have entered a period of uncertain duration in which the hard-won wisdom we’d accumulated and handed down has been thrown in the bin by a new generation of engineers, accountants and administrators, and some extraordinarily stupid decisions and system designs are once again being made. The new design process is apparently quite straightforward: What problem are we trying to solve? How can we achieve this in the least effective, least secure, most time-consuming, most annoying, most customer-loyalty-destroying way possible? Now, how fast can we implement that? Get to it!

If aliens landed and looked at some of the recent ways we have started to use IT, they’d conclude that this was all a green conspiracy, designed to make everyone so anti-technology that we’d be happy to throw hundreds of years of progress away and go back to the 16th century. Given how successful they have been in destroying so much of the environment under the banner of protecting it, there is ample evidence that greens really haven’t a clue what they are doing. Worse still, gullible political and business leaders will cheerfully do the exact opposite of what they want, as long as the right doublespeak is used when they’re sold the policy.

The main Green laboratory in the UK is the previously nice seaside town of Brighton. Although the Greens are an extreme socialist party that one might think would be a binperson’s best friend, the ones in charge nevertheless managed to force their binpeople to go on strike, turning what ought to be an environmental paradise into a stinking, litter-strewn cesspit for several weeks. They’ve also managed to create near-permanent traffic gridlock, as if designed to maximise the amount of air pollution and CO2 they can get from the traffic.

More recently, they have decided to change their parking meters for the very latest IT. No longer do you have to reach into your pocket, push a few coins into a machine and carry a paper ticket all the way back to your car windscreen. Such a tedious process consumed up to a minute of your day. It simply had to be replaced with proper modern technology. There are loads of IT solutions to pick from, but the Greens apparently decided to go for the worst possible implementation, resulting in numerous press reports about how awful it is. IT should not be awful; it can and should be done in ways that are better in almost every respect than old-fashioned systems. I rarely drive anyway and hardly ever go to Brighton, but I am still annoyed by incompetent or deliberate misuse of IT.

If I were to go there by car, I’d also have to go via the Dartford Crossing, where again, inappropriate IT has been used incompetently to replace tollbooths for a toll that makes no economic sense in the first place. The government would be better off if it simply paid for the crossing directly. Instead, each person using it is likely to be fined if they don’t know how it operates, and even if they do, they have to spend far more time and effort to pay than they did before. Again, it is a severe abuse of IT, conferring a tiny benefit on a tiny group of people at the expense of a significant extra load on very many people.

Another financial example is the migration to self-pay terminals in shops. In Stansted Airport’s W H Smith a couple of days ago, I sat watching a long queue of people taking forever to buy newspapers. Instead of a few seconds handing over a coin and walking out, it was taking a minute or more to read menus, choose which buttons to touch, inspect papers to find barcodes, fumble for credit cards, check some more boxes, check they hadn’t left their boarding pass or paper behind, and finally leave. An assistant stood there idle, watching people struggle instead of serving them in a few seconds. I wanted a paper, but the long queue was sufficient deterrent and they lost the sale. Who wins in such a situation? The staff who lost their jobs certainly didn’t. I as the customer had no paper to read, so I didn’t win. Given all the lost sales, I would be astonished if W H Smith were better off, so they didn’t win either. The airport will likely make less from its take too. Even the terminal manufacturing industry only swaps one type of POS terminal for another with marginally different costs. I’m not knocking W H Smith; they are just one of loads of companies doing this now. But it isn’t progress, it is going backwards.

When I arrived at my hotel, another electronic terminal was replacing a check-in assistant with a check-in terminal usage assistant. He was very friendly and helpful, but check-in wasn’t any easier or faster for me, and the terminal design still needed him to be there too, because like so many others, it was designed by people who have zero understanding of how other people actually do things. Just like those ticket machines in rail stations that we all detest.

When I got to my room, the thermostat used a tiny LCD panel with tiny, meaningless symbols, no backlight, and black text on a dark green background, in a dimly lit room. Even after searching for my reading glasses, and since I hadn’t brought a torch with me, I couldn’t see a thing on it, so I couldn’t use the air conditioning. An on/off switch and a simple wheel with temperatures marked on it used to work perfectly well. If it ain’t broke, don’t do your very best to totally wreck it.

These are just a few everyday examples, alongside other everyday IT abuses such as minute fonts and frequent use of meaningless icons instead of straightforward text. IT is wonderful. We can make devices with absolutely superb capability for very little cost. We can make lives happier, better, easier, healthier, more prosperous, even more environmentally friendly.

Why then are so many people so intent on using advanced IT to drag us back into another dark age?

Ultra-simple computing: Part 2

Chip technology

My everyday PC uses an Intel Core i7-3770 processor running at 3.4GHz. It has 4 cores running 8 threads on 1.4 billion 22nm transistors in just 160mm^2 of chip. It has an NVIDIA GeForce GTX 660 graphics card and 16GB of main memory. It is OK most of the time, but although processor and memory utilisation rarely get above 30%, its response is often far from instant.

Let me compare it briefly with my (subjectively, at the time I owned it) best ever computer, my Macintosh IIfx, RIP, which I got in 1991: the computer on which I first documented both the active contact lens and text messaging, and on which I suppose I also started this project. The Mac IIfx ran a 68030 processor at 40MHz, with 273,000 transistors, 4MB of RAM and an 80MB hard drive. Every computer I’ve used since then has given me extra function at the cost of lower performance, wasted time and frustration.

Although its OS is stored on a 128GB solid state disk, my current PC takes several seconds longer to boot than my Macintosh IIfx did – that machine went from cold to fully operational in 14 seconds – yes, I timed it. On my PC today, clicking a browser icon and getting to the first page usually takes a few seconds. Clicking on a Word document back then took a couple of seconds to open; it still does now. Both computers gave real-time response to typing, and both featured occasional unexplained delays. I didn’t need a firewall or virus checkers back then, but now I run tedious maintenance routines a few times every week. (The only virus I had before 2000 was nVIR, which came on the Mac II system disks.) I still don’t get many viruses, but the significant time I spend avoiding them has to be counted too.

Going back further still, my first ever computer, in 1981, was an Apple II, with only 9,000 transistors running at 2.5MHz and a piddling 32kB of memory. The OS was tiny. Nevertheless, on it I wrote my own spreadsheet, graphics programs, lens design programs, and an assortment of missile, aerodynamic and electromagnetic simulations. Using the same transistors as the i7, you could make 1,000 of these processors in a single square millimetre!

Of course some things are better now. My PC has amazing graphics and image processing capabilities, though I rarely make full use of them. It lets me browse the net (and see video ads). If I don’t mind telling Google who I am, I can also watch videos on YouTube, or I could tell the BBC or some other video provider who I am and watch theirs. I could theoretically play quite sophisticated computer games, but it is my work machine, so I don’t. I do use it as a music player and to show photos. But mostly, I use it to write, just like my Apple II and my Mac IIfx. Subjectively, it is about the same speed for those tasks. Graphics and video are the main things that differ.

I’m not suggesting going back to an Apple II or even a IIfx. However, using i7 chip technology, a 9,000-transistor processor running 1,360 times faster and taking up 1/1000th of a square millimetre would still let me write documents and simulations, but would be blazingly fast compared to my old Apple II. I could fit around 150,000 of them in the same chip space as the i7. Or I could have 5,128 Mac IIfxs running at 85 times their original speed, or something like a single Mac IIfx running 85 times faster than the original for a tiny fraction of the price. There are certainly a few promising trees in the forest that nobody seems to have barked up. As an interesting aside, that 22nm Apple II chip would only be ten times bigger than a skin cell, probably less now, since my PC is already several months old.
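As a rough sanity check, here is the arithmetic behind those comparisons as a short Python sketch. Every figure is the one quoted above, not a measured value, so treat the outputs as back-of-envelope numbers only.

# Back-of-envelope arithmetic for the chip comparisons above.
# All figures are the ones quoted in the text.

i7_transistors = 1.4e9      # Core i7-3770
i7_area_mm2 = 160.0
i7_clock_hz = 3.4e9

mac_transistors = 273_000   # 68030 in the Mac IIfx
mac_clock_hz = 40e6

apple2_transistors = 9_000  # processor figure quoted in the text
apple2_clock_hz = 2.5e6

density = i7_transistors / i7_area_mm2            # transistors per mm^2
apple2_per_mm2 = density / apple2_transistors     # ~1,000 tiny processors per mm^2
apple2_on_i7_die = apple2_per_mm2 * i7_area_mm2   # ~155,000 on the whole die
apple2_speedup = i7_clock_hz / apple2_clock_hz    # ~1,360x clock ratio

mac_on_i7_die = i7_transistors / mac_transistors  # ~5,128 Mac IIfx class cores
mac_speedup = i7_clock_hz / mac_clock_hz          # ~85x clock ratio

print(f"{apple2_per_mm2:,.0f} Apple II class processors per mm^2")
print(f"{apple2_on_i7_die:,.0f} of them on a 160 mm^2 die, each ~{apple2_speedup:,.0f}x faster")
print(f"{mac_on_i7_die:,.0f} Mac IIfx class processors, each ~{mac_speedup:,.0f}x faster")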

At the very least, that raises the question of what all this extra processing is actually needed for, and why there is still any noticeable delay in doing anything in spite of it. Each of those earlier machines was perfectly adequate for everyday tasks such as typing or spreadsheeting. All the extra speed only benefits some things, and most of it is being wasted by poor code. Some of the delays we had 20 and 30 years ago still affect us just as badly today.

The main point though is that if you can make thousands of processors on a standard sized chip, you don’t have to run multitasking. Each task could have a processor all to itself.

The operating system currently runs programs to check which processes need attention, determine their priorities, schedule processing for them, and copy their data in and out of memory. None of that is needed if each process can have its own dedicated processor and memory all the time. There are lots of ways of using basic physics to allocate processes to processors, relying on simple statistics to ensure that collisions rarely occur. No code is needed at all.

An ultra-simple computer could therefore have a large pool of powerful, free processors, each with their own memory, allocated on demand using simple physical processes. (I will describe a few options for the basic physics processes later). With no competition for memory or processing, a lot of delays would be eliminated too.
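To illustrate the statistics rather than the physics, here is a toy Python sketch of purely random allocation, which is my own illustration rather than anything specified above. The processor pool size comes from the chip arithmetic earlier in this post; the task count is an assumption picked purely for illustration.

import random

# Toy model of allocation with no scheduler: each of n_tasks simply picks one
# of n_processors at random. We estimate how often any two tasks collide.

def collision_probability(n_processors, n_tasks, trials=10_000):
    collisions = 0
    for _ in range(trials):
        chosen = [random.randrange(n_processors) for _ in range(n_tasks)]
        if len(set(chosen)) < n_tasks:  # at least two tasks picked the same processor
            collisions += 1
    return collisions / trials

# With 150,000 processors and 50 simultaneous tasks, the birthday-problem
# estimate is 1 - exp(-50*49 / (2*150000)), i.e. under 1%, and the rare
# collision can be resolved by a simple retry rather than a central scheduler.
print(collision_probability(150_000, 50))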

Ultra-simple computing: Part 1

Introduction

This is the first part of a techie series. If you aren’t interested in computing, move along, nothing to see here. It is a big topic, so I will cover it in several manageable parts.

Like many people, I spent a good few hours changing passwords after the Heartbleed problem and then again after eBay’s screw-up. It is a futile task in some ways, because passwords are no longer a secure defence anyway. A decent hacker with a decent computer can crack hundreds of passwords in an hour, so unless an account is locked after a few failed attempts, and many aren’t, passwords only keep out casual observers and the most amateurish hackers.
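To put very rough numbers on that, here is a back-of-envelope Python sketch. The guess rate is an assumption of mine (a single consumer GPU attacking a fast, unsalted hash offline), not a figure from this post, and real rates vary enormously with the hash and the hardware.

# Rough back-of-envelope: how long does an exhaustive password search take?
# guesses_per_second is an assumed figure for illustration only.

guesses_per_second = 1e9

def seconds_to_exhaust(alphabet_size, length):
    return alphabet_size ** length / guesses_per_second

# 8 lowercase letters: 26^8 ≈ 2.1e11 guesses -> a few minutes
print(seconds_to_exhaust(26, 8) / 60, "minutes")

# 8 characters from ~95 printable symbols: 95^8 ≈ 6.6e15 -> a couple of months
print(seconds_to_exhaust(95, 8) / 86400, "days")

# 12 characters from the same set: 95^12 ≈ 5.4e23 -> roughly 17 million years
print(seconds_to_exhaust(95, 12) / (86400 * 365), "years")

The arithmetic also shows why locking an account after a few failed attempts matters so much: it forces the attacker into the slow online path instead of the fast offline one.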

The need for simplicity

A lot of problems are caused by the complexity of today’s software, which makes it impossible to find every error and hole. Weaknesses have been added to operating systems, office automation tools and browsers to increase functionality for only a few users, even though they add little for most of us most of the time. I don’t think I have ever executed a macro in Microsoft Office, for example, and I’ve certainly never used print merge or many of its other publishing and formatting features. I was perfectly happy with Word 93, and most things added since then (apart from the real-time spelling and grammar checker) have added irrelevant and worthless features at the expense of safety. I can see very little user advantage in allowing pop-ups on web sites, or tracking cookies. Their primary purpose is to learn about us to make marketing more precise. I can see why marketers want that, but I can’t see why I should. Users generally want pull marketing, not push, and pull doesn’t need cookies; there are better ways of sending your standard data when needed, if that’s what you want to do, and many better ways of automating logons to regular sites if that is needed.

In a world where more of the people who wish us harm are online, it is time to design an alternative platform that is designed specifically to be secure from the start, with no features added that allow remote access or control without deliberate, explicit permission. It can be done. A machine with a strictly limited set of commands and access can be made secure and can even be networked safely. We may have to sacrifice a few bells and whistles, but I don’t think we will need to sacrifice many that we actually want or need. It may be less easy to track us and advertise at us, or to offer remote machine analysis tools, but I can live with that and you can too. Almost all the services we genuinely want could still be provided. You could still browse the net, still buy stuff, still play games with others, and still socialise. But you wouldn’t be able to install or run code on someone else’s machine without their explicit knowledge. Every time you turn the machine on, it would be squeaky clean. That’s already a security benefit.

I call it ultra-simple computing. It is based on the principle that simplicity and a limited command set make a machine easy to understand and easy to secure; that basic physics and logic are more reliable than severely bloated code; that enough is enough, and more than that is too much.

We’ve been barking up the wrong trees

There are a few things you take for granted in your IT that needn’t be so.

Your PC has an extremely large operating system. So does your tablet, your phone, games console… That isn’t really necessary. It wasn’t always the case and it doesn’t have to be the case tomorrow.

Your operating system still assumes that your PC has only a few processing cores and has to allocate priorities and run-time on those cores for each process. That isn’t necessary.

Although you probably use some software in the cloud, you probably also download a lot of software off the net or install from a CD or DVD. That isn’t necessary.

You access the net via an ISP. That isn’t necessary. Almost unavoidable at present, but only due to bad group-think. Really, it isn’t necessary.

You store data and executable code in the same memory and therefore have to run analysis tools that check all the data in case some is executable. That isn’t necessary.

You run virus checkers and firewalls to prevent unauthorized code execution or remote access. That isn’t necessary.

Overall, we live with an IT system that is severely unfit for purpose. It is dangerous, bloated, inefficient, excessively resource- and energy-intensive, extremely fragile and vulnerable to attack via many routes, designed with the user as a lower priority than suppliers, with a philosophy of functionality at any price. The good news is that it can be replaced by one that is absolutely fit for purpose: secure, invulnerable, cheap, reliable, resource-efficient, and one that works just fine. Even better, it could be so cheap that you could have both, living as risky an online life as you like in the areas that don’t really matter, knowing you have a safe platform to fall back on when your risky system fails or when you want to do anything that involves your money or private data.