Tag Archives: surveillance

Future Surveillance

This is an update of my last surveillance blog from 6 years ago, much of which is common discussion now. I’ll briefly repeat the key points to save you reading it.

They used to say

“Don’t think it

If you must think it, don’t say it

If you must say it, don’t write it

If you must write it, don’t sign it”

Sadly this wisdom is already as obsolete as Asimov’s Laws of Robotics. The last three lines have already been automated.

I recently read of new headphones designed to recognize thoughts so they know what you want to listen to. Simple thought recognition in various forms has been around for 20 years now. It is slowly improving, but with smart networked earphones we’re already providing an easy platform into which to sneak better monitoring and better thought detection, sold on convenience and ease of use of course.
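For a sense of how crude most of this ‘thought recognition’ still is, here is a minimal sketch of the classic approach: classify a relaxed versus focused state from EEG band power. The sampling rate and band edges are standard textbook values, but the device access, thresholds and stand-in data are all assumptions for illustration, not details of any real product.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz for a hypothetical earphone EEG channel

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].mean()

def crude_state_guess(eeg_window):
    """Very rough 'relaxed vs focused' guess from one second of one EEG channel."""
    alpha = band_power(eeg_window, FS, 8, 12)   # alpha band, associated with relaxation
    beta = band_power(eeg_window, FS, 13, 30)   # beta band, associated with active focus
    return "relaxed" if alpha > beta else "focused"

# Random noise stands in for a real recording here
rng = np.random.default_rng(0)
print(crude_state_guess(rng.normal(size=FS)))
```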

You already know that Google and various other large companies have very extensive records documenting many areas of your life. It’s reasonable to assume that any or all of this could be demanded by a future government. I trust Google and the rest to a point, but not a very distant one.

Your phone, TV, Alexa, or even your networked coffee machine may listen in to everything you say, sending audio to cloud servers for analysis, and naivety is your only defense against those audio records being stored and potentially used for nefarious purposes.

Some next generation games machines will have 3D scanners and UHD cameras that can even see blood flow in your skin. If these are hacked or left switched on – and social networking video is one of the applications they are aiming to capture, so they’ll be on often – someone could watch you all evening, capture the most intimate body details, and film your facial expressions and gaze direction while you are looking at a known image on a particular part of the screen. Monitoring pupil dilation, smiles, anguished expressions and so on could provide a lot of evidence about your emotional state, together with a detailed record of what you were watching and doing at exactly that moment, and with whom. By monitoring blood flow and pulse via your Fitbit or smartwatch, plus skin conductivity, your level of excitement, stress or relaxation can easily be inferred. If given to the authorities, this sort of data might be useful to identify pedophiles or murderers, by seeing which men are excited by seeing kids on TV or who gets pleasure from violent games, and that is likely to be one of the justifications offered for collecting it.
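To show how little machinery that inference needs, here is a minimal sketch; the resting baselines, weights and sample values are all invented for illustration, and a real system would be trained on labelled data rather than using fixed cut-offs.

```python
from statistics import mean

def arousal_score(heart_rate_bpm, skin_conductance_us, resting_hr=65.0, resting_sc=2.0):
    """Crude 0-1 arousal estimate from pulse and skin conductance.
    Baselines and weights are invented purely for illustration."""
    hr_part = max(0.0, (heart_rate_bpm - resting_hr) / 60.0)   # ~1.0 at +60 bpm
    sc_part = max(0.0, (skin_conductance_us - resting_sc) / 8.0)
    return min(1.0, 0.6 * hr_part + 0.4 * sc_part)

# Hypothetical wearable samples taken while someone watches one particular scene
pulse_samples = [72, 75, 90, 104, 101]
conductance_samples = [2.1, 2.3, 4.8, 6.5, 6.1]
print(round(arousal_score(mean(pulse_samples), mean(conductance_samples)), 2))
```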

Millimetre wave scanning was controversial when it was introduced in airport body scanners, but we have had no choice but to accept it and its associated abuses – the only alternative is not to fly. 5G uses millimetre waves too, and it’s reasonable to expect that the same people who can already monitor your movements in your home simply by analyzing your wi-fi signals will be able to do a lot better by analyzing 5G signals.
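Detecting movement from radio signals is less exotic than it sounds: at its crudest it is just watching the received signal strength fluctuate as a body disturbs the radio path. A minimal sketch with made-up numbers (real systems use far richer channel state information than this):

```python
from statistics import pstdev

def movement_detected(rssi_samples_dbm, threshold_db=2.0):
    """Flag movement if received signal strength varies more than expected.
    A person moving through the radio path perturbs the signal; the
    threshold here is invented for illustration."""
    return pstdev(rssi_samples_dbm) > threshold_db

quiet_room = [-52, -52, -53, -52, -52, -53]
someone_walking = [-52, -57, -49, -60, -51, -58]
print(movement_detected(quiet_room))       # False
print(movement_detected(someone_walking))  # True
```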

As mm-wave systems develop, they could become much more widespread, so burglars and voyeurs might start using them to check whether there is anything worth stealing or videoing. Maybe some search company making visual street maps might ‘accidentally’ capture a detailed 3D map of the inside of your house when they come round, as well as or instead of everything they could access via your wireless LAN.

Add to this the ability to use drones to get close without being noticed. Drones can be very small, fly themselves and automatically survey an area using broad sections of the electromagnetic spectrum.

NFC bank and credit cards not only present risks of theft, but also make it easier to track what we spend, where, on what, and with whom. NFC capability in your phone makes some parts of life easier, but NFC has always been yet another doorway that may be left unlocked by security holes in operating systems or apps, and apps themselves carry many assorted risks. Many apps ask for far more permissions than they need to do their professed tasks, and their owners collect vast quantities of information for purposes known only to them and their clients. Obviously data can be collected using a variety of apps and then linked together at its destination. Not all providers are honest, and apps are still very inadequately regulated and policed.

We’re seeing increasing experimentation with facial recognition technology around the world, from China to the UK, and only a few authorities so far, such as San Francisco, have had the wisdom to ban its use. Heavy-handed UK police, who increasingly police according to their own political agenda even at the expense of policing actual UK law, have already fined people who covered their faces to avoid being scanned in face recognition trials. It is reasonable to assume they would gleefully seize any future opportunity to access and cross-link all of the various data pools currently being assembled, under the excuse of reducing crime but with the real intent of policing their own social engineering preferences. Using advanced AI to mine zillions of hours of full-sensory data gathered on every one of us via all this routine IT exposure and extensive, ubiquitous video surveillance, they could deduce everyone’s attitudes to just about everything: the real truth about our attitudes to every friend and family member, TV celebrity, politician or product; our detailed sexual orientation and any fetishes or perversions; our racial attitudes and political allegiances; our attitudes to almost every topic ever aired on TV or in everyday conversation; how hard we are working, how much stress we are experiencing, and many aspects of our medical state.

It doesn’t even stop with public cameras. Innumerable cameras and microphones on phones, visors, and high street private surveillance will automatically record all this same stuff for everyone, sometimes with benign declared intentions such as making self-driving vehicles safer, sometimes with social media tribes capturing any kind of evidence against ‘the other’. In-depth evidence will become available to back up prosecutions of crimes that today would not even be noticed. Computers that can retrospectively data mine evidence collected over decades and link it all together will be able to identify billions of real or invented crimes.

Active skin will one day link your nervous system to your IT, allowing you to record and replay sensations. You will never be able to be sure that you are the only one who can access that data. I could easily hide algorithms in a chip or program that only I know about, which no amount of testing or inspection could ever reveal. If I can, any decent software engineer can too. That’s the main reason I have never trusted my IT: I am quite nice, but I would probably be tempted to put some secret stuff into any IT I designed, just because I could and could almost certainly get away with it. If someone was making electronics to link to your nervous system, they’d probably be at least tempted to put in a back door too, or be told to by the authorities.
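How easily hidden functionality evades testing is simple to illustrate with a toy. In this purely hypothetical sketch (no relation to any real product), only a hash of the trigger is stored, so reading the source does not reveal which input activates the hidden branch, and routine testing will never stumble onto it; real hardware back doors are far subtler still.

```python
import hashlib

# Arbitrary 64-hex constant standing in for the hash of a secret trigger value
_HIDDEN_TRIGGER_SHA256 = "6b51d431df5d7f141cbececcf79edf3dd861c3b4069f0b11661a3eefacbba918"

def process(command: str) -> str:
    # Hidden branch: fires only when the command hashes to the stored value
    if hashlib.sha256(command.encode()).hexdigest() == _HIDDEN_TRIGGER_SHA256:
        return "secret mode enabled"
    # Normal, documented behaviour that every test will exercise
    return "ran: " + command

print(process("status"))  # behaves perfectly normally under any routine test
```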

The current panic about face recognition is justified. Other AI can already lipread, and recognize gestures and facial expressions, better than people can. Together they add knowledge of everywhere you go, everyone you meet, everything you do, everything you say and even your emotional reaction to all of it, to all the other knowledge gathered online or by your mobile, fitness band, electronic jewelry or other accessories.

Fools utter the old line: “if you are innocent, you have nothing to fear”. Do you know anyone who is innocent? Of everything? Who has never ever done or even thought anything even a little bit wrong? Who has never wanted to do anything nasty to anyone for any reason? And that’s before you even start to factor in corruption of the police, mistakes, being framed, dumb juries or secret courts. The real problem here is not the abuses we already see. It is what is being and will be collected and stored, forever, available to all future governments of every persuasion and to police authorities who consider themselves better than the law. I’ve often said that our governments are usually incompetent but rarely malicious. Most of our leaders are decent people and only a few are corrupt, but most are technologically inept. With an increasingly divided society, there’s a strong chance that the ‘wrong’ government or even a dictatorship could get in. Which of us can be sure we won’t be up against the wall one day?

We’ve already lost the battle to defend privacy. The only bits left are where the technology hasn’t caught up yet. In the future, not even the deepest, most hidden parts of your mind will be private. Pretty much everything about you will be available to an AI-upskilled state and its police.

The future of freedom of speech

This is mainly about the UK, but some applies elsewhere too.

The UK Police are in trouble yet again for taking the side of criminals against the law-abiding population. Our police seem to have frequent trouble understanding the purpose of their existence. This time, in the wake of the Charlie Hebdo murders, some police forces decided that their top priority was not to protect freedom of speech, nor to protect law-abiding people from terrorists, but instead to visit the newsagents that were selling Charlie Hebdo and take the names of people buying copies. Charlie Hebdo has become synonymous with the right to exercise freedom of speech, and by taking the names of its buyers, those police forces have clearly decided that Charlie Hebdo readers are the problem, not the terrorists. Some readers might indeed present a threat, but so might anyone in the population. Until there is evidence to suspect a crime, or at the very least the plotting of a crime, it is absolutely no rightful business of the police what anyone does. Taking names of buyers treats them as potential suspects for future hate crimes. It is all very ‘Minority Report’, mixed with more than a touch of ‘Nineteen Eighty-Four’. It is highly disturbing.

The Chief Constable has since clarified to the forces that this was overstepping the mark, and one of the offending forces has since apologised. The others presumably still think they were in the right. I haven’t yet heard any mention of them saying they have deleted the names from their records.

This behavior is wrong but not surprising. The UK police often seem to have socio-political agendas that direct their priorities and practices in upholding the law, individually and institutionally.

Our politicians often pay lip service to freedom of speech while legislating for the opposite. Clamping down on press freedom and the creation of thought crimes (aka hate crimes) have both used the excuse of relatively small abuses of freedom to justify taking away our traditional freedom of speech. The government reaction to the Charlie Hebdo massacre was not to ensure that freedom of speech is protected in the UK, but to increase surveillance powers and guard against any possible backlash. The police have also become notorious for checking social media in case anyone has said anything that could possibly be taken as offensive by anyone. Freedom of speech only remains in the UK provided you don’t say anything that anyone could claim to be offended by, unless you can claim to be a member of a preferred victim group, in which case it sometimes seems that you can do or say whatever you want. Some universities won’t even allow some topics to be discussed. Freedom of speech is under heavy downward pressure.

So where next? Privacy erosion is a related problem that becomes lethal to freedom when combined with a desire for increasing surveillance. Anyone commenting on social media already assumes that the police are copied in, but if the government gets its way, that will be extended to a list of the internet services and websites you visit, and anything you type into a search engine. That isn’t the end though.

Our televisions and games consoles listen in to our conversations (to facilitate voice commands) and send some of the voice recordings to the manufacturers. We should expect that many IoT devices will do so too. Some might send video, perhaps to facilitate gesture recognition, and the companies might keep that too. I don’t know whether they data mine any of it for potential advertising value or whether they are 100% benign and only use it to deliver the best possible service to the user. Your guess is as good as mine.

However, since the principle has already been demonstrated, we should expect that the police may one day force them to give up their accumulated data. They could run a smart search across the entire population to find any voice or video samples or photos that might indicate anything remotely suspicious, and could then use legislation to increase monitoring of the suspects. They could build an extensive suspicion database covering the whole population, just in case it might be useful. Given that there is already strong pressure to classify a wide range of ordinary everyday relationship rows or financial quarrels as domestic abuse, this is a worrying prospect. The vast majority of the population have at some time argued with a partner, used a disparaging comment or called someone a name in the heat of the moment, said something in the privacy of their home that they would never dare say in public, used terminology that isn’t up to date, or said something less than complimentary about someone on TV. All we need to make the ‘Demolition Man’ automated fine printout a reality is more time and more of the same government and police attitudes that we are already accustomed to.
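That ‘smart search across the entire population’ does not even need to be smart to generate a suspicion list. A minimal sketch, with invented transcripts and an invented watch-list, flags any household whose stored voice-assistant transcript mentions a listed word; a real system would use speech-to-text and language models, but the flavour, false positives included, is much the same:

```python
watch_list = {"bomb", "attack", "riot"}   # invented watch-list terms

# Hypothetical store of voice-assistant transcripts, keyed by household
transcripts = {
    "household_17": "did you see the match, the second half was a total bomb",
    "household_42": "remember to attack the weeds this weekend",
    "household_99": "put the kettle on please",
}

suspects = {}
for household, text in transcripts.items():
    hits = sorted(set(text.lower().split()) & watch_list)
    if hits:
        suspects[household] = hits

print(suspects)
# {'household_17': ['bomb'], 'household_42': ['attack']}
# Perfectly innocent phrasing still lands two households on the list.
```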

The next generation of software for TVs and games consoles could easily include monitoring of eye gaze direction; maybe some already do. It might be needed for control (e.g. look and blink), to make games smarter, or for other benign reasons. But when the future police get records of everything you have watched, of what image was showing on a particular part of the screen when you made a particular expression, gesture or comment, then we will pretty much have the thought police. They could build a full statistical picture of your attitudes to a wide range of individuals, groups, practices, politics or policies, and a long list of ‘offences’ for anyone they don’t like this week. None of us are saints.
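Correlating gaze with what was on screen at the time is mostly bookkeeping. A minimal sketch, with invented screen regions, labels and gaze samples, tallies how long a viewer dwelt on each on-screen item; combine that with the pupil dilation and expression data described earlier and you have exactly the statistical attitude picture above:

```python
from collections import defaultdict

# Hypothetical labelled screen regions: label -> (x0, y0, x1, y1) in pixels
regions = {
    "politician_A": (0, 0, 960, 540),
    "politician_B": (960, 0, 1920, 540),
    "advert": (0, 540, 1920, 1080),
}

def dwell_times(gaze_samples, regions, sample_period_s=0.02):
    """Total seconds of gaze per labelled region, from (x, y) gaze samples."""
    counts = defaultdict(int)
    for x, y in gaze_samples:
        for label, (x0, y0, x1, y1) in regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                counts[label] += 1
                break
    return {label: round(n * sample_period_s, 3) for label, n in counts.items()}

# Fake gaze track: mostly on politician_B, briefly on the advert
samples = [(1200, 300)] * 150 + [(500, 800)] * 30
print(dwell_times(samples, regions))
# {'politician_B': 3.0, 'advert': 0.6}
```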

The technology is all entirely feasible in the near future. What will make it real or imaginary is the attitude of the authorities, the law of the land and especially the attitude of the police. Since we are seeing an increasing disconnect between the police and the intent behind the law of the land, I am not the only one who will be worried by this.

We’ve already lost much of our freedom of speech in the UK. If we do not protest loudly enough and defend what we have left, we will soon lose the rest, and then lose freedom of thought. Without the freedom to think what you want, you don’t have any freedom worth having.

 

The future of prying

Prying is one side of the privacy coin, hiding being the other side.

Today, lots of Snapchat photos have been released, and no doubt some people are checking to see if there are any of people they know; it is a pretty safe bet that some will send links to compromising pics of colleagues (or teachers) to others who know them. It’s a sort of push prying, isn’t it?

There is more innocent prying too. Checking Zoopla to see how much your neighbour got for their house is a little bit nosy but not too bad. At the extremely innocent end of the line, reading someone’s web page is the sort of prying they actually want some people to do, even if not necessarily you.

The new security software I just installed lets parents check up on their kids’ online activity. Protecting your kids is good, but monitoring every aspect of their activity just isn’t: it doesn’t give them the privacy they deserve and probably gets them so used to being snooped on that they accept state snooping more easily later in life. Every parent has to draw their own line, but kids do need to feel trusted as well as protected.

When adults install tracking apps on their partner’s phones, so they can see every location they’ve visited and every call or message they’ve made, I think most of us would agree that is going too far.

State surveillance is increasing rapidly, and we often don’t even think of it as such. For example, when speed cameras are linked ‘so that the authorities can make our roads safer’, the incidental monitoring and recording of our comings and goings happens without any real social debate. Add the replacement of tax discs by number plate recognition systems linked to databases, and even more data is collected. Also ‘to reduce crime’, video from millions of CCTV cameras is stored, and some is of high enough quality to be analysed by machine to identify people’s movements and social connectivity. Then there are our phone calls, text messages and all our web and internet accesses, all of which need to be stored, either in full or at least as metadata, so that ‘we can tackle terrorism’. The state already has a very full picture of your life, and it is getting fuller by the day. When it is a benign government, it doesn’t matter so much, but if the data is not erased after a short period, then you also need to worry about future governments, whether they will still be benign, or whether you will be one of the people they want to start oppressing. You also need to worry that access to your data is being granted to a growing number and variety of public sector workers for a widening range of reasons, with seemingly lower security competence, meaning that a good number of people around you will be able to find out rather more about you than they really ought. State prying is always sold to the electorate with assurances that it is there to make us safer and more secure and to reduce crime, but the state is staffed by your neighbors, and in the end that means your neighbors can pry on you.
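Linking those camera databases takes almost no effort once the records exist. A minimal sketch, with invented plates, timestamps and camera locations, shows how whole journeys, and hence patterns of life, drop out of a simple group-and-sort:

```python
from collections import defaultdict

# Hypothetical ANPR sightings: (plate, ISO timestamp, camera location)
sightings = [
    ("AB12 CDE", "2019-07-01T08:02", "M25 J10"),
    ("XY34 ZZZ", "2019-07-01T08:05", "A3 Guildford"),
    ("AB12 CDE", "2019-07-01T08:41", "A3 Guildford"),
    ("AB12 CDE", "2019-07-01T09:10", "Portsmouth docks"),
]

journeys = defaultdict(list)
for plate, when, where in sightings:
    journeys[plate].append((when, where))

for plate, stops in journeys.items():
    route = " -> ".join(where for _, where in sorted(stops))
    print(plate, ":", route)
# AB12 CDE : M25 J10 -> A3 Guildford -> Portsmouth docks
# XY34 ZZZ : A3 Guildford
```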

Tracking cookies are a fact of everyday browsing, but mostly they are just trying to get data to market to us more effectively. Reading every email to get data for marketing may be stretching the relationship with the customer to its limits, but many of us Gmail users still trust Google not to abuse our data too much, and certainly not to sell our business dealings on to potential competitors. It is still prying though, however automated it is, and a wider range of services are being linked all the time. The internet of things will put data collection devices all over homes and offices too. We should ask how much we really trust global companies to hold so much data, much of it very personal, when we’ve seen several times this year that it can be made available to anyone via hackers or handed over to the authorities under compulsion. Almost certainly, bits of your entire collected and processed electronic activity history could get you higher insurance costs, or in trouble with family, friends, neighbors, the boss, the tax-man or the police. Surveillance doesn’t have to be real time: databases can be linked, mashed up and analysed by far future software or AI too. In the ongoing search for crimes and taxes, who knows what future governments will authorize? If you wouldn’t make a comment in front of a police officer or tax-man, it isn’t safe to make it online or in a text.

Allowing email processing in exchange for free email is a similar trade-off to using a supermarket loyalty card: you sell personal data for free services or vouchers. You have a choice to use that service or another supermarket, or not to use the card, so as long as you are fully aware of the deal, it is your lifestyle choice. The lack of good competition does reduce that choice though. There are not many good products or suppliers out there for some services, and in a few there is a de facto monopoly. There can also be a huge inconvenience, time loss or social investment cost in moving if terms and conditions change and you no longer want to accept the deal.

On top of that state and global company surveillance, we now have everyone’s smartphones and visors potentially recording anything and everything we do and say in public, with rarely a say in what happens to that data or whether it is uploaded and tagged on some social media.

Some companies offer detective-style services where they will do thorough investigations of someone for a fee, picking up all they can learn from a wide range of websites they might use. Again, there are variable degrees that we consider acceptable according to context. If I apply for a job, I would think it is reasonable for the company to check that I don’t have a criminal record, and maybe look at a few of the things I write or tweet to see what sort of character I might be. I wouldn’t think it appropriate to go much further than that.

Some say that if you have done nothing wrong, you have nothing to fear, but none of them has a 3 digit IQ. The excellent film ‘Brazil’ showed how one man’s life was utterly destroyed by a single letter typo in a system scarily similar to what we are busily building.

Even if you are a saint, do you really want the pervert down the road checking out hacked databases for personal data on you or your family, or using their public sector access to see all your online activity?

The global population is increasing, and every day a higher proportion can afford IT and know how to use it. Networks are becoming better and AI is improving, so they will have greater access and greater processing potential. Cyber-attacks will increase, and security leaks will become more common. More of your personal data will become available to more people with better tools, and quite a lot of them wish you harm. Prying will therefore increase geometrically; Metcalfe’s Law suggests the number of possible prying connections grows roughly with the square of the number of people on the network.

My defense against prying is having an ordinary life: not being famous or a major criminal, not being rich, and being reasonably careful about security, so there are lots of easier and more lucrative targets. But there are hundreds of millions of busybodies, jobsworths, nosy parkers, hackers and blackmailers out there with unlimited energy to pry, as well as anyone who doesn’t like my views on a topic and wants to throw some mud, and their future computers may be able to access, translate and process pretty much anything I type, as well as much of what I say and do anywhere outside my home.

I find myself self-censoring hundreds of times a day. I’m not paranoid. There are some people out to get me, and you, and they’re multiplying fast.

 

 

 

Deep surveillance – how much privacy could you lose?

The news that our electronic activities were being monitored, which seems to have caught much of the media by surprise, comes as no surprise at all to anyone who has worked in IT for the last decade or two. In fact, I can’t see what’s new. Since the early 90s I have always assumed that everything I write and do online, or say or text on a phone, or watch on digital TV, or do on a games console, is recorded forever and checked by computers, now or at some time in the future, for anything bad. If I don’t want anyone to know I am thinking something, I keep it in my head. Am I paranoid? No. If you think I am, then it’s you who is being naive.

I know that if some technically competent spy with lots of time and resources really wants to monitor everything I do day and night and listen to pretty much everything I say, they could, but I am not important enough, bad enough, threatening enough or even interesting enough, and that conveys far more privacy than any amount of technology barriers ever could. I live in a world of finite but just about acceptable risk of privacy invasion. I’d like more privacy, but it’s too much hassle.

Although government, big business and malicious software might want to record everything I do just in case it might be useful one day, I still assume some privacy, even if it is already technically possible to bypass it. For example, I assume that I can still say what I want in my home without the police turning up even if I am not always politically correct. I am well aware that it is possible to use a function built into the networks called no-ring dial-up to activate the microphone on my phones without me knowing, but I assume nobody bothers. They could, but probably don’t. Same with malware on my mobiles.

I also assume that the police don’t use millimetre wave scanning to video me or my wife through the walls and closed curtains. They could, but probably don’t. And there are plenty of sexier targets to point spycams at so I am probably safe there too.

Probably, nobody bothers to activate the cameras on my iPhone or Nexus, but I am still a bit cautious about where I point them, just in case. There is simply too much malware out there to ever assume my IT is safe. I only plug a camera and microphone into my office PC when I need them. I am sure watching me type or read is pretty boring, and few people would do it for long, but I have my office blinds drawn and close the living room curtains in the evening for the same reason – I don’t like being watched.

In a busy tube train, it is often impossible to stop people getting close enough to use an NFC scanner to copy details from my debit card and Barclaycard, but they can be copied at any till or in any restaurant just as easily, so there is a small risk but it is both unavoidable and acceptable. Banks discovered long ago that it costs far more to prevent fraud 100% than it does to just limit it and accept some. I adopt a similar policy.

Enough of today. What of tomorrow? This is a futures blog – usually.

Well, as mm-wave systems develop, they could become much more widespread, so burglars and voyeurs might start using them to check whether there is anything worth stealing or videoing. Maybe some search company making visual street maps might ‘accidentally’ capture a detailed 3D map of the inside of your house when they come round, as well as or instead of everything they could access via your wireless LAN. Not deliberately of course, but they can’t check every line of code that some junior might have put in by mistake when they didn’t fully understand the brief.

Some of the next generation games machines will have 3D scanners and HD cameras that can apparently even see blood flow in your skin. If these are hacked or left switched on – and social networking video is one of the applications they are aiming to capture, so they’ll be on often – someone could watch you all evening, capture the most intimate body details, and film your facial expressions while you are looking at a known image on a particular part of the screen. Monitoring pupil dilation, smiles, anguished expressions and so on could provide a lot of evidence about your emotional state, together with a detailed record of what you were watching and doing at exactly that moment, and with whom. By monitoring blood flow and pulse, and possibly skin conductivity via the controller, your level of excitement, stress or relaxation can easily be inferred. If given to the authorities, this sort of data might be useful to identify paedophiles or murderers, by seeing which men are excited by seeing kids on TV or who gets pleasure from violent games, so obviously we must allow it, mustn’t we? We know that Microsoft’s OS has had the capability for many years to provide a back door for the authorities. Should we assume that the new Xbox is any different?

Monitoring skin conductivity is already routine in IT labs as an input. Thought recognition is possible too, and though primitive today, it will spread as the technology progresses. So your thoughts can be monitored as well. Thoughts, added to emotional reactions and knowledge of circumstances, would allow a very detailed picture of someone’s attitudes. By using high-speed future computers to data mine zillions of hours of full-sensory data gathered on every one of us via all this routine IT exposure, a future government or big business prone to bending the rules could deduce everyone’s attitudes to just about everything: the real truth about our attitudes to every friend and family member, TV celebrity, politician or product; our detailed sexual orientation and any fetishes or perversions; our racial attitudes and political allegiances; our attitudes to almost every topic ever aired on TV or in everyday conversation; how hard we are working, how much stress we are experiencing, and many aspects of our medical state. And they could steal your ideas, if you still have any after putting all your effort into self-censorship.

It doesn’t even stop there. If you dare to go outside, innumerable cameras and microphones on phones, visors, and high street surveillance will automatically record all this same stuff for everyone. Thought crimes already exist in many countries, including the UK. In-depth evidence will become available to back up prosecutions of crimes that today would not even be noticed. Computers that can retrospectively data mine evidence collected over decades and link it all together will be able to identify billions of crimes.

Active skin will one day link your nervous system to your IT, allowing you to record and replay sensations. You will never be able to be sure that you are the only one who can access that data. I could easily hide algorithms in a chip or program that only I know about, which no amount of testing or inspection could ever reveal. If I can, any decent software engineer can too. That’s the main reason I have never trusted my IT: I am quite nice, but I would probably be tempted to put some secret stuff into any IT I designed, just because I could and could almost certainly get away with it. If someone was making electronics to link to your nervous system, they’d probably be at least tempted to put in a back door too, or be told to by the authorities.

Cameron utters the old line: “if you are innocent, you have nothing to fear”. Only idiots believe that. Do you know anyone who is innocent? Of everything? Who has never ever done or even thought anything even a little bit wrong? Who has never wanted to do anything nasty to a call centre operator? And that’s before you even start to factor in corruption of the police, mistakes, being framed, dumb juries or secret courts. The real problem here is not what Prism does or what the US authorities are giving to our guys. It is what is being and will be collected and stored, forever, available to all future governments of every persuasion. That’s the problem: they don’t delete it. I’ve often said that our governments are usually incompetent but not malicious. Most of our leaders are decent people, even if a few are a little corrupt. But what if it all goes wrong, and we somehow end up with a deeply divided society and the wrong government, or a dictatorship gets in? Which of us can be sure we won’t be up against the wall one day?

We have already lost the battle to defend our privacy. Most of it is long gone, and the only bits left are those where the technology hasn’t caught up yet. In the future, not even the deepest, most hidden parts of your mind will be private. Ever.