The Early Days of a Better Nation

Tuesday, December 02, 2008



All Your Firewall Are Belong to Us



[Note: this is a lightly edited version of a talk I gave on 30 September at this year's Gartner IT Security Summit, and later rewired as a GoH talk at Octocon. The talk depends a lot on topicality, so (a) the dates matter and (b) I can't squeeze any more talks out of it, so here it is. The picture above isn't actually of me giving that talk but it does fit what I was saying: the future isn't a straight line! It's wiggly!]

Good afternoon, and thank you for coming. I was a little perplexed when I was asked to speak about the future to a conference on IT security. Having worked ten years in IT is of course no guarantee that you know anything about security; in fact I suspect the average programmer gives people like you more sleepless nights than the average user does. As for writing science fiction, it's more or less a guarantee that you're going to get the future wrong. The science fiction writer Jack Vance gave due warning about predictions:

"What are your fees?" inquired Guyal cautiously. "I respond to three questions," stated the augur. "For twenty terces I phrase the answer in clear and actionable language; for ten I use the language of cant, which occasionally admits of ambiguity; for five, I speak a parable which you must interpret as you will; and for one terce, I babble in an unknown tongue." [1]

This may also be true of management consultants.

Today I'm going for clear and actionable language, on the basis that more than twenty years ago I saw this day coming. Some time around about 1986 the Reader in Mechanical Engineering at Brunel University told me that his best students weren't going into engineering, they were going to work for much higher salaries in the City, which had just undergone the famous Big Bang deregulation. And it did occur to me to think that a society whose best young engineers were going into finance was laying up some trouble for itself. I must admit I didn't actually foresee reading in New Scientist in September 2008 that the graduates most in demand by the City were literally rocket scientists.

There are two problems with thinking about the future, apart from the fact that we can't see it. One is that we like stories; we're a story-telling species, and our heads are cluttered up with stories. But stories have a dramatic structure and usually a somehow satisfying ending. The future doesn't work like that. The other is that we tend to think that things will continue along the same lines, like projecting a line on a graph. But the future doesn't work like that, either.
Shares, as the small print - well, today it's big black print - always reminds us, can go down as well as up. So can trends. This is why, for example, there isn't a US and a Soviet moon base, like in 2001: A Space Odyssey. It was a perfectly reasonable projection in 1969, because we'd had decades of big-scale state-led projects: the American freeway system, the welfare state, Keynesian economics, nuclear power, the space programme, and even the Soviet Union at that time was still looking pretty solid. There seemed no reason why that trend shouldn't continue. So nearly every writer of the sixties assumed we'd have marvellous mass transit systems in 2008 and lots of people would be living in domes on the Moon. And the whole world would have one big computer. The more imaginative had one big computer for each continent. Some time in the early 80s it became clear that this was yesterday's tomorrow.

Most of you, I'm sure, are familiar with the concept of legacy code. It has various definitions, but when I was working in IT back in the 80s we tended to use it for code inherited from someone else, which in the case of London Electricity usually meant someone who had retired or was otherwise no longer available for anything short of unfeasible amounts of money because they'd gone to work in the City to write the programs to implement the complex financial instruments being designed by all those engineers and rocket scientists from Brunel. In either case, that someone would have left an incredibly complex, system-critical, uncommented and undocumented tangle of code. A perhaps over-familiar example of legacy code from that time is the so-called Millennium Bug. Late in 1989 I was modifying a billing program for London Electricity so that the date-printing module could handle the change from 80s to 90s, and I pointed out that there was going to be another decade rollover 10 years later that would change the date from 19-something to 20-something, and that while we were at it we might as well fix that too. Most of the programmers there were science fiction readers, and unlike normal people we were pretty sure the 21st century was actually going to happen one of these days. We then had a discussion, took it to our line management, and they decided to fix the problem in all the other programs. So if you want to know who saved the world from the Millennium Bug, it was science fiction fans. Actually what I'm proudest of is the comment I put beside the fix: This will work for 2000 and 2100 but for subsequent century dates may require amendment. That's legacy code.
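[Note: the shape of that fix is what later came to be known as a "windowing" fix. A minimal sketch in Python, purely illustrative - the original would have been in a mainframe language, and the pivot year here is invented:]

```python
# Hypothetical reconstruction of a two-digit-year "windowing" fix.
# A pivot year decides which century a two-digit year belongs to.
PIVOT = 80  # assumption: 80-99 mean 19xx, 00-79 mean 20xx


def expand_year(yy: int) -> int:
    """Expand a two-digit year to four digits using a fixed pivot window.

    Works for dates from 1980 to 2079; for subsequent dates the window
    itself may require amendment.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return (1900 if yy >= PIVOT else 2000) + yy


assert expand_year(89) == 1989
assert expand_year(8) == 2008
```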

I used the term 'legacy code' in one of my novels, and Farah Mendlesohn, a science-fiction critic who read it, thought it was a term I had made up, and she promptly adapted it for critical use as 'legacy text'. Legacy text is all the other science fiction stories that influence the story you're trying to write, and that generally clutter up your head even if you never read, let alone write, the stuff. Most of us have default images of the future that come from Star Trek or 2001 or 1984 or Dr Who or disaster movies or computer games. These in turn interact with the tendency to project trends straightforwardly into the future. The great legacy text for the past twenty years or so is William Gibson's Neuromancer. Because Gibson was the guy who saw in the early 1980s that the next twenty years weren't going to be like the fifties and sixties in space. He famously foresaw something like the World Wide Web, but more importantly his novel projects a world where the US military is up there dominating the world from above, while down below the rest of us, from multinational corporations to lone hackers, sort of scrabble around in anarchy. I've written plenty of stories along the same lines, some of which may have become legacy texts themselves.

What if all the stories whose legacy text is Neuromancer have just turned into yesterday's tomorrow? What if right now, we're at a moment when trends that looked inexorable have reached a turning point? What if the common sense of the age is about to flip from free-market capitalism to state-regulated capitalism? Of course, turning that into actual policy won't happen overnight, or smoothly - too much political legacy code - but if it does happen then over the next ten years or so we'll be in an age of big government projects, some sort of new New Deal. We'd find ourselves back in the day before yesterday's tomorrow.

What that would mean is that every problem that has loomed large recently, from global warming to terrorism to Peak Oil to poverty to obesity, suddenly starts getting tackled in a quite different way, from the top down instead of from the bottom up. Or rather, macro-managed instead of micro-managed. People might actually feel freer, because instead of ubiquitous surveillance and endless nagging and hassling of individuals to change their behaviour and worry about their diet or carbon footprint or recycling or whatever, the governments would go straight to building non-carbon and hopefully nuclear power stations and installing reliable public transport and bike lanes, and instead of having us peering at small print on labels simply force the food manufacturers to stop putting fructose and transfats and god knows what other rubbish into the food in the first place. The problems do not go away but the whole attitude towards tackling them changes. Climate change starts looking very different if it begins to be seen in terms of building massive new power sources, transport systems, various schemes of geo-engineering - terraforming Earth, as we call it in the science-fiction biz - instead of cutting back on what we've already got and stopping Africa and Asia from getting it too.

What would such a big turning point mean for IT security? Well, the IT security problem of the past twenty years or so has come from the interaction of two processes: the proliferation and cheapening of computers, and the growth of insecurity and inequality. Some computers - in fact most computers - end up in the hands of people who are clueless about computers. Others end up in the hands, or at least under the fingers, of people who are unemployed or underemployed or criminally employed, and they've progressed from writing viruses to writing ever more sophisticated hacks, spam, scams and exploits. It has been a by-product of the processes that have also led to what Misha Glenny in his book McMafia has called the globalisation of crime.

Now the problem of IT security will not go away, but the very nature of the problem changes if the education system has to adapt to preparing people for manufacture instead of McJobs (or finance), and if there are big technology-heavy projects to soak up the script kiddies and hackers and spammers and scammers into doing something more productive and useful and indeed profitable. And if the West is engaging in big development projects, it can hardly sic the IMF and World Bank on developing countries to let the market rip and shut down their public sectors and protected industries. So you might see again in the Middle East and Africa actual secular progressive regimes, which in the long run would drain a lot of the swamp that terrorism comes from, not to mention Nigerian scams.

All this would take place against the background of one trend that's probably not going to reverse any time soon: Moore's Law, which very roughly means that next year's computer would do twice as much twice as fast for the same price if it weren't for Microsoft. There are similar trends in information storage and bandwidth. My friend Charles Stross, who knows far more about these things than I do, has written a brilliant analysis of the consequences of Moore's Law, in a talk you can find on his website, Charlie's Diary. I can't really do better than that, so I'm just going to say we can assume that over the next couple of decades the amount of computing power, the amount of information that that processing power has available to process, and the amount of bandwidth available to pass information around, are all going to increase to levels that very quickly take you into science fiction territory. Say they double every two years. In twenty years you get ten doublings: 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024. At that point you probably hit the buffers of fundamental physical limits. But along the way you have to envisage a world within your own professional lifetime and indeed business planning horizon where most people have something like a Cray supercomputer in their pocket and access it on their glasses or contact lenses.
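[Note: the compounding is easy to check for yourself - a throwaway sketch, taking the two-year doubling period as the talk's assumption rather than a law:]

```python
# Compound the assumed doubling: capability doubles every two years.
factor = 1
for year in range(2, 21, 2):  # ten doublings over twenty years
    factor *= 2
    print(f"year {year:2d}: x{factor}")
# year 20: x1024 -- three orders of magnitude within one planning horizon.
```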

Now if things were to continue the way they have been over the past twenty years, this is a huge problem for IT security, because you have vast quantities of personal information out there, very insecure, and vast processing resources to throw at it. This translates into immense opportunities for social engineering hacks. If you know the names and birthdays and maiden names and schools and for all I know Amazon picks of not just everyone but of everyone they know, security questions don't look like much of a barrier. The way Sarah Palin's Yahoo mail got hacked is a very small example of how that can already be done. And to be honest it was because I was thinking along those lines and reading about the Russian Business Network and the Nigerian scam and the cyber attacks on Estonia and Georgia and so on that I came up with the title of this talk, 'All your firewall are belong to us.' But as I say, I think the nature of the problem is going to change. It changes from protecting people from their own cupidity and stupidity to protecting large infrastructure and manufacture from unintended security flaws and from very well-designed attacks, whether sabotage or espionage. Because the world I've just outlined is no more one of peace and harmony than the world of the postwar boom was. Instead of one Cold War there could be several going on simultaneously, in which all sides will be engaged in carrying out and in defending against cyber attacks.

It is, however, a world where IT is going to be even more important than it has been in the world of finance capital. And its focus is going to be different. Instead of going into the City or Wall Street, engineers are going to be working on engines, rocket scientists working on rockets. And instead of applying arcane probability calculations to designing improbable financial instruments, we could see a resurgence and elaboration of the sort of applications that in the sixties it was widely expected computers would mainly get used for: physical and economic planning. Applications like linear programming, critical path analysis, input-output tables, material requirement planning. In the early 90s when I was working for Ferranti, I mean GEC, I mean BAE Systems, I worked on the edges of an MRP II (Manufacturing Resource Planning) system and I was surprised that such a thing actually existed, while we were at the same time turning over the NHS to pseudo-market systems being developed at vast expense. Robust planning systems already exist but are often disrupted by economic turbulence arising out of the activities of all those Stock Exchange rocket scientists. Which is where we came in.
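[Note: to make one of those applications concrete, critical path analysis is a few lines of code once the task graph is written down - a toy sketch, with invented tasks and durations:]

```python
# Toy critical-path analysis (CPM): the project length is the longest
# path through the task-dependency DAG. Tasks and durations are invented.
from functools import lru_cache

durations = {"dig": 3, "pour": 2, "frame": 5, "wire": 2, "roof": 3}
depends_on = {"pour": ["dig"], "frame": ["pour"],
              "wire": ["frame"], "roof": ["frame"]}


@lru_cache(maxsize=None)
def earliest_finish(task: str) -> int:
    """Earliest finish time: own duration plus the slowest prerequisite."""
    prereqs = depends_on.get(task, [])
    return durations[task] + max((earliest_finish(p) for p in prereqs), default=0)


print(max(earliest_finish(t) for t in durations))  # 13: dig-pour-frame-roof
```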

What I would like to wind up with in terms of thinking about the future is, as advertised, things that are unlikely to happen and things that look unlikely now but probably will happen and hit us all upside the head. As a science fiction writer I'm under a contractual obligation to mention aliens. I boldly say we will not encounter aliens. OK, so much for that. What about other unpredictable developments? There are plenty of catastrophic risks out there, as well as exciting possibilities. Leaving aside giant meteor impacts, local or global nuclear war, new diseases, Greenland's ice sliding into the North Atlantic in one go, or another Yellowstone supervolcano eruption - i.e. things the IT industry can do very little to prepare for, other than having very good off-site back-up and disaster recovery programmes in place, preferably in the middle of Australia - there are a few things we can think about that tie in rather neatly with my restrained and conservative suggestion that the developed world is about to take a massive lurch to the left.

One of the things that recent SF, the Neuromancer-type SF of recent decades, got right was that even if humanity's power over big things seemed to have slipped from its grasp, its power over small things such as molecules was increasing very quickly indeed. This is in part a consequence of Moore's Law, in that researchers in genomics, proteomics, physiomics and all the other omics found they had more and more computer resources to throw at the problem. I don't think this is going to change or that it's anywhere near its limits, so I wouldn't be at all surprised if some of us here live long enough to find that biotechnology has come up with a cure for death. Well, for aging, cancer, and most kinds of crippling injury and disability, at least. It might even be a cosmetics company that does it. Immortality - because you're worth it. This would set the whole pension funds business in something of a new light, let's say. But it would also have consequences so profound that no SF writer to my knowledge has ever actually tackled in depth the possibility of a near-future world where the most annoying generation in history - mine - is still around in 2100 if not 3100. Perhaps the prospect just doesn't bear thinking about. Nevertheless, if you do want to imagine that kind of world, the world of the next couple of decades, I can recommend works like Bruce Sterling's Distraction and Holy Fire, Charles Stross's Accelerando, and Vernor Vinge's Rainbows End.

If people are going to live longer and healthier lives the next step might be to make people smarter. That seems a bit unlikely in the next ten years but it's not impossible. I understand that physicists are already notorious for popping pills to speed up their synapses. The only part of human intelligence enhancement that really impacts IT security is that people might then stop writing down their passwords on Post-It notes. With that, however, I suspect I may have strayed into the realms of fantasy. So we can expect ever more emphasis on biometric security, and you can be sure that by the time it's implemented it too will be hackable: not just in terms of hacking the IT systems that store the biometric data, but the live data itself - by tweaking bits of DNA to change the expression of particular genes, perhaps to change iris patterns or fingerprints. Eventually we will find ourselves in the world of the open source human.

Another area where ever-increasing computing power is expected to have radical consequences is of course Artificial Intelligence (AI). I've been saying for twenty years that human-equivalent AI has all my life been where it was when I was twenty years old myself - twenty years away. But ethereally-networked supercomputers in every pocket, with increasingly fine-grained interfaces with our own brains and bodies, may yet have some surprises in store. The biggest surprise might be that an AI that emerged out of that wouldn't notice us at all, any more than we notice our neurons. It wouldn't come to eat our brains because our brains are already eating for it. It would only be interested in talking to the aliens who we are likewise unaware of. That is in fact the end of that novel I mentioned earlier, Neuromancer, where the emergent AI tells a character about how an alien AI has already tried to make contact, but ''Til there was me, natch, there was nobody to know, nobody to answer.'

Finally, we can be sure that if there is indeed a big turn away from neo-liberalism and towards regulation and large-scale projects, lots and lots of things will go horribly wrong, fortunes unimaginable today will be squandered on gigantic schemes that never pay off, and conflicts and contradictions will build up until that trend, too, is reversed or turns into something different in twenty years' time or less. And, no doubt, there'll be science fiction writers and futurologists standing up in front of conferences like this and saying that everything they've been saying for the past twenty years is all out of date and we have to brace ourselves for a sharp turn to the free market. With or without life-extension and smart pills, I hope we are all there to see it.

[1] From "Guyal of Sfere", The Dying Earth.
Vance wikiquote consulted 29/9/08

28 Comments:

>What if the common sense of the age is about to flip from free-market capitalism to state-regulated capitalism? Of course, turning that into actual policy won't happen overnight, or smoothly - too much political legacy code - but if it does happen then over the next ten years or so we'll be in an age of big government projects, some sort of new New Deal. We'd find ourselves back in the day before yesterday's tomorrow.


See my recent post in Grist Magazine on Big Green, something I've been pushing since 1998.

not enough uranium, Ken, not enough, it should be saved for space travel, nothing packs more punch per pound...

Nuclear rockets could cut cost of Moon base

seems rocket scientists have worked it out...

wish I had a good link on me, there's this pdf or this pdf, but known high grade uranium reserves would be used up in 10-20 years if used to replace all current coal fired power stations, lower grade reserves require greater processing, are more expensive, though then of course more economically viable, and while generally better than gas in greenhouse gas emissions (Nuclear energy becoming less sustainable), are still incredibly expensive

good thing state-led capitalism will be there to socialise the cost of these white elephants

heaps of thorium though, much better bet, still a good thirty years away from being a real power source though, pity all those resources for fusion development have been spent in the last half century, seems to have been a goose chase.

damn wrong link

second pdf

nice discussion here too

If you want to save uranium for space travel then go implement big solar and wind. They complement each other nicely. Peak wind and peak sun mostly (not always) occur at different times. Use solar thermal not photovoltaic and you can store heat rather than electricity at the solar end. Wind tends to come and go in short bursts so you can use comparatively small amounts of electrical storage. Connect them by long distance transmission, and voila, stable baseload, load following and peak power, made reliable by the addition of only a little natural gas (5% or less).

Millenium! He wrote Millenium!

"[T]he governments would... simply force the food manufacturers to stop putting fructose and transfats and god knows what other rubbish into the food in the first place".

Considering that it was governments who forced the food manufacturers to put fructose and transfats into the food in the first place, the omens for a constructive outcome are not good.

"And if the West is engaging in big development projects, it can hardly sic the IMF and World Bank on developing countries to let the market rip and shut down their public sectors and protected industries". Oh, no? Where there's a law there's a loophole, and the golden rule is that he who has the gold makes the rules.

Oh, and you don't need uranium supplies for nuclear power once you've got started. You can breed U-233 from thorium, which is far more plentiful, and burn that.

"Millenium" bug fixed - thanks!

Re. Moore's Law:
There are two issues - more transitors, and faster processors.

The limit on transistors will be financial (is it worth building a $10 billion fab?). With current economics, things will slow down, and could stop in 5 to 10 years.

We've seen a leveling off in the "single thread" performance of processors. The latest "Nehalem" is shipping at 3.2 GHz, the first Core 2 processors hit... 3 GHz (Nehalem is maybe 10 or 20% faster clock for clock).

Of course, we are getting more cores, but that will take special programs...

'"Millenium" bug fixed' - oh, no, it isn't. You have fallen into the error of "just one small change" rather than going over everything, doing unit testing rather than system testing.

Meika wrote "heaps of thorium though, much better bet, still a good thirty years away from being a real power source though" - probably only ten years away, given past work and current efforts in places like India. It's straightforward development work from here on, not open ended research.

"pity all those resources for fusion development have been spent in the last half century, seems to have been a goose chase" - there's no such difficulty; the correct phrase is wild goose chase. Regular geese are no problem; they often even have their wings clipped.

FYI, "Whoissecretdubai" is not a UAE government agent but merely a stalker.

Utterly unrelated, but I'm happy to inform Ken MacLeod that Cosmonaut Keep has just been translated and published in Italy (the two other instalments will follow in 2009)

note: captcha word "imene" means "hymen" in Italian, and this is my first post, so I'm "virgin" here, ain't it funny?

I agree, we need to save 'non-renewable' resources for projects that require them for the ultra long-term. Escape velocity from the gravity well requires (for now) the use of essentially irreplaceable fuels. Sunlight can make your car go, but it won't put a single stage to orbit vehicle up.

However, I applaud the realization that as more and more computing/processing and network communications ability is spread across the planet, IT or network security becomes incredibly more complex.

I mean, really, what do you do when you get hit with a denial of service attack on your clothing?
Where's the interface, even for an advanced user? Are you going to demand an OS on your t-shirt?
Will it be Windows? *cringe*
Or will it all be flash memory firmware?
How are you going to sync the update cycle between your running shoes and your glasses?
Not to mention that the divergence of source code - compatibility and legacy code issues as well - will only expand. A single standard will never work for everything and everyone, but interface standards will become the nightmare of the future. And you think international business travel is bad now when you can't plug in your shaver and your laptop. Let us pray it is neither USB nor FireWire.

A question that continues to get to me is how long we can carry on with large non-state actors as the source of international standards for things like internet, networking, and data transfer protocols. IPv6, theoretically, will last us forever. But there will inevitably be an unnoticed downside, blindside, or massive shift. The electronic equivalent of going to every home in every nation, removing all the copper electrical wiring, and running fiber. And replacing all the bulbs. And the outlets. And the appliances.

Legacy code's real secret is that by the time you've lost the original writers or maintaining group and added an order of magnitude to the complexity of the program, any sufficient replacement project will be hideously difficult. Not to mention that by the time the replacement code is equal to the current code, the current code will do seven new out-of-spec things simply from maintenance upgrade.

Mind, this applies mostly to very large code projects. Not your webapplication storefront or your web browser. Or even your OS though Windows is heading there. More as to massive resource management and distribution networks, energy grid systems, etc.

Anyhow, rambling thoughts. Thanks for the spark, much appreciated.

Thanks, TechSlave - interesting comment. A denial of service attack on someone's clothing is sure to go into one of my stories unless Charlie gets there first.

Ken,

Feel free to use as you wish. Charlie may sneak it in though - he's tricky, that one. My comments tend to digress a bit without someone to actually impede the flow. But I greatly appreciate things which give inspiration to new or interesting trains of thought.


The trend towards reliance on bandwidth and communication availability for products as well as services will likely only rise. Especially when DRM licensing servers tether your music, and Google Apps doesn't like being offline very much (Google Gears aside).
I've noticed it also with the many operations that suddenly can no longer be easily completed over the phone or by fax. It's creeping into local government, and corporate management. Currently, the place I work would be (and is) halfway paralyzed when an Active Directory flub occurs. Not to mention when the horrible webapplication-using-Java payroll management system gives up the ghost for the day.

The trend is not 'all bad'. But a lot of assumptions are being made regarding the status of online access. It's become more of a utility/need for many people, and a critical infrastructure for business operations. Meanwhile, providers are looking to control people's usage rather than expand availability. And let's not mention the security paradigm and the innate hazard of security (especially on the level of virus, trojan, spyware, etc) being far and away more of a reactive than proactive concern.

Let us hope we get smart about the risks. I would love to see the engineers working on these critical issues, rather than giving up and heading for financial sectors. But for those of us in the United States, higher education at all is becoming more and more of a stretch. The system of federal grant and loan structures is considerably overtaxed while the total cost of a degree is accelerating far past inflation and value of the dollar.
Here's hoping we don't end up with a new world of illiteracy problems, where the divide between the priesthood and the peon is the ability to understand the power and danger of infotech.

"Escape velocity from the gravity well requires (for now) the use of essentially irreplaceable fuels."

LOX/H2?

LOX/Alcohol?

What's irreplaceable?

John,

Are you saying you can reclaim the LOX/H2 mixture from a true 'renewable' source, such as sunlight or the (hopefully) not-likely-to-end wells of geothermal or wind?

However, if you meant to say "these are not fossil fuels": we use LOX, liquid hydrogen, hydrazine/oxygen, aluminum and ammonium perchlorate. These are not replaceable. Fortunately, they are not fossil fuels, so I do stand corrected.

My main point was really that we should be focusing our use of non-renewable fuels on functions they are absolutely necessary for. Even plastics, which are another petroleum based product, are being put to extremely wasteful use as the 'default' packaging material for everything in the consumer world.

Perhaps I made that part of my point badly.

Minor quibble: hydrazine proper is not used much these days, apart from "cold" attitude rockets. Its derivatives like monomethyl or unsymmetric dimethyl hydrazine - yes, sometimes blended with hydrazine - are far more common (and not usually burned with oxygen but with other oxidants like hydrogen peroxide or nitrogen tetroxide).

Plastics are not purely and inherently petrochemical, e.g. at the present time a major ingredient of nylon is sebacic acid produced from castor oil.

Since these fuels can be produced by electricity & electricity can be produced by nuclear power, it depends on whether nuclear should count as a renewable. Since it can be kept going for at least the next 5 billion years, if it doesn't count as such then "renewable" is merely an ideological rather than engineering description.

It would be very nice if science-fiction writers were to stop fantasizing about what kinds of internet applications could be constructed if only packets could travel through the network at superluminal speeds. Some problems just can't really be hand-waved away by inverse multiplexing over mesh networks, and I'm getting pretty tired of seeing that trope played for more than it's worth.

Denial of service attacks on your clothing are so old school - think of man-in-the-middle attacks on your vision (or your augmented memory - shades of 1984). Woops - Vernor beat you to it, as well as invention of a new profession: code archaeology.

Ken, I didn't know that you have been discussing the failure of AI for TWENTY years. That's great! I've been having similar gripes since 1977. In that year I read a (so-called) psychology textbook called HUMAN INFORMATION PROCESSING. I got annoyed by its overwhelming neglect of physiology and its use of all sorts of AI notions AS IF they were based on solid results. A close reading (for as long as my patience lasted) showed that few if any were. I became suspicious immediately and a bit enraged at what I considered to be scientific charlatanism. I've been assured that real results, e.g. about chess-playing programs, have been attained recently. That's fine: but in most philosophy and science that's post-behaviorist, consciousness, say visual sensing, is considered to be up for grabs. Is it there? If so, how do you best describe it? What's its function? Is it computational? Is it necessary for intentional, intelligent action? etc. Nobody knows! For an excellent SF novel solidly based on all this, read Peter Watts' brilliant and intense BLINDSIGHT. You'll learn some cutting-edge biology and neuroscience.

OOPS! Ken, I am certain that you have read BLINDSIGHT and that you know lots of the cutting-edge and---often more important---durable science involved.

Your passing joke about Australia is already obsolete.
Someone sat down with a map of the United States and shaded the areas with earthquakes, floods, tornadoes, and a dozen other potential disasters, and built a giant fortress of backups -- in Las Vegas.

Tech Slave said:
>I mean, really, what do you do when you get hit with a denial of
>service attack on your clothing?

It's not exactly the same thing, but suddenly all I could think of was the removal of a holographic character's pants in I Think We're All Bozos on This Bus (Hugo-nominated, partially recycled from authentic Caltech compu-scratch):

Close B close mode on Deputy Dan!


The best predictors of the future are what we will desire and what will be possible - but not only are the two whats uncertain, the answer to 'Whom do you mean by "we"?' is never permanently definitive.

If the Austrian (Mises, Hayek, ...) school of economics is right, the current recession and coming economic debacle have their source in [too much] government meddling in money, rather than in too little of same.

If they are right, then each small bust is cured by laying the foundations for an even bigger one, and the one we are seeing now is pretty big. So what if it is indeed cured, or "cured" for some time? What comes next? And what will the cure have to be if it follows the same progression?

"All your banks are belong to us" will be the beginning of a fast transition to a planned economy. But just as Chinese capitalism is communism with lots of Chinese characteristics, that planned econmy will never be called such, but rather a "capitalism with Western democratic characteristics".

Interesting reversals ahead!

Our dear leader's current solution is to borrow another £500 billion. If the current fall in the £ & interest rates means nobody sane will lend, they now appear to be thinking about just printing the stuff ("quantitative easing"). This does indeed rather suggest an even more spectacular crash.

The fact is the world as a whole is still growing at about 2%. This recession is entirely caused by government & they could end it instantly by just getting out of the way.

To expand on what Nedbrek mentioned, not only have we hit the wall in GHz on CPU speed, a Sandia white paper shows that beyond 8 cores you don't get much performance boost. So Moore's Law isn't just hitting a wall, it's hitting it pretty soon.

Reference to Sandia paper here: More Chip Cores Can Mean Slower Supercomputers, Report Says.
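[Note: the Sandia result is specifically about memory-bandwidth contention, but the classic back-of-envelope for diminishing multi-core returns is Amdahl's law. A quick sketch, where the 90% parallel fraction is an invented assumption:]

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelises.
def speedup(n_cores: int, p: float = 0.9) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)


for n in (1, 2, 4, 8, 16, 64):
    print(f"{n:2d} cores: {speedup(n):4.1f}x")
# 8 cores give ~4.7x, 64 cores only ~8.8x: returns flatten fast past 8.
```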

The limit on clock speeds gets imposed by physics. A 4 GHz CPU would run hotter than thermite, while a 10 GHz CPU would run hotter than the surface of the sun.
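[Note: the standard back-of-envelope behind that claim: CMOS switching power scales roughly as C·V²·f, and since supply voltage historically had to rise with clock speed, power grows closer to the cube of frequency. A rough sketch under that simplified cube-law assumption, with invented baseline figures:]

```python
# Rough CMOS scaling: P ~ C * V^2 * f; if V must scale with f, P ~ f^3.
# Back-of-envelope only -- the baseline part and wattage are invented.
BASE_F, BASE_P = 3.2, 100.0  # assume a 3.2 GHz part dissipating ~100 W


def power_at(f_ghz: float) -> float:
    return BASE_P * (f_ghz / BASE_F) ** 3


for f in (3.2, 4.0, 10.0):
    print(f"{f:4.1f} GHz -> ~{power_at(f):5.0f} W in the same die area")
# 4 GHz is ~2x the heat, 10 GHz ~30x: hence the hard ceiling on clock speed.
```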

The ongoing claim by science fiction writers that Moore's Law will keep on going seems to fly in the face of the obvious evidence. PC speeds hit the wall around 2002, operating systems hit the wall -- all converging on more or less the same paradigm and the same functionality -- between the advent of Windows 2000 and OS X 10.2 and Ubuntu Edgy Eft. (I can make my Ubuntu 8.10 look almost exactly like OS X Tiger or Leopard by tweaking some conf files and adding an open source dock .deb) True AI crashed and burned, useful voice recognition crashed and burned, the frame problem was never solved, machine translation was never solved... So it's not obvious how computers will get faster or smarter or better. The goal is a DWIM interface -- "Do What I Mean," but there's no sign of that on the horizon.

As for improving human intelligence, no one has yet succeeded in defining intelligence, and as a rule of thumb, what you can't measure, you can't improve. Lewis Terman, who did crucial work in developing IQ tests, gathered a group of super-high-IQ kids around him. These kids were supposed to achieve great things. He left out of the group two future Nobel prize winners, Luis Alvarez and William Shockley, because their IQ scores just weren't high enough.

Current intelligence research seems to be roughly at the level of astrology, so until (unless) that improves, it seems overoptimistic to hope for genetically engineered smarter humans.

If Moore's Law were to hit the wall it could reasonably be taken as evidence that we were reaching the upper half of the S curve of human progress. On the other hand so far we aren't & indeed the doubling time, which was originally 2 years, sank to 18 months & may now be nearer 1 year.

When it slows I will believe it is slowing but predictions that it must start doing so soon are no more valid than those predictions that aircraft couldn't be built large enough to carry 2 people, for reasons which looked valid at the time.
