Sometimes it feels as though the march of progress is all but unstoppable and innovation will not rest until we're all living it up in a virtual world, on our virtual yachts, sipping virtual pina coladas.
However, if the past has taught us anything, it's that technological advances can always be turned to atrocious use. In splitting the atom, we created nuclear weapons before nuclear power; who's to say we won't do the same with nanobots, virtual reality or the latest medical research?
As well as the potential for humans to kill and maim one another, there are also certain technologies that we should never attempt due to the sheer amount of accidental damage they would cause.
Many a sci-fi has begun with a well-meaning scientist meddling with powers he cannot hope to control. Dr Frankenstein meant to create the paragon of human perfection and instead created a monster. You can't always know which innovations are bad ones until they happen. After all, if you play with fire, you might get burnt, but you might also be able to keep yourself warm.
That said, there are a few future technologies that you just know are a bad idea from the word go.
8. Self-Replicating Tech
The idea of a non-biological self-replicating system is not new. John von Neumann proposed his concept for such a system back in 1948, detailing a machine that could use raw materials to build another version of itself and duplicate its programming into the new machine.
Granted, this was more of a thought experiment at the time, but now we are rapidly approaching the point at which our technology could make it a reality.
The benefits of self-replicating technology are certainly tempting. You would only have to manufacture a small amount yourself before the machine took over and costs of things like shipping would be slashed. The problem we have is one of handing over certain powers of judgement and creation to something with much simpler programming than us, particularly in the case of nanotechnology.
This is easily the kind of technology that you can lose control of and, even if the machines created are not harmful or dangerous themselves, the rapid consumption of resources in exponential replication could well lead to your robotics lab being disintegrated around you by a swarm of hungry nanobots. The "grey goo" scenario might be extreme, but not totally off the wall.
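To see why exponential replication gets out of hand so quickly, here is a toy model of the scenario above. All the numbers are invented for the sake of illustration: one machine to start, a planet-scale resource pool, and each machine consuming one unit of raw material to build a copy of itself every generation.

```python
# Illustrative only: a toy model of unchecked self-replication.
# Every generation, each machine builds one copy of itself.

def generations_to_exhaust(initial_machines: int, resource_units: float,
                           units_per_machine: float) -> int:
    """Count doubling generations before the resource pool runs dry."""
    machines = initial_machines
    consumed = machines * units_per_machine
    generations = 0
    while consumed < resource_units:
        machines *= 2  # the population doubles each generation
        consumed += machines * units_per_machine
        generations += 1
    return generations

# Starting from a single machine, a pool of 1e24 resource units
# (at one unit per machine) is exhausted in just 79 generations.
print(generations_to_exhaust(1, 1e24, 1.0))  # → 79
```

If each generation took an hour, that's less than four days from a single prototype to total resource exhaustion, which is the whole grey-goo worry in one number.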
If the machines themselves are harmful, however, we run into a whole new set of problems...
7. Nanobot Weaponry
It can be argued that the increasing automation and "techifying" of warfare is saving lives. With robots and drones in the field, you don't need to put as many human lives on the line.
However, with nanotechnology moving further to the fore, do we need to draw a line somewhere?
Swarm technology would essentially be the 21st century equivalent of unleashing a plague of locusts - or a biological weapon - on the enemy. They could be used to block sunlight, destroy resources, jam communications and restrict movement - all without putting a single boot on the ground. Make them self-replicating and there is virtually no defence.
They cannot be shot at or bombed; they cannot be disabled or captured; they act more like an infectious disease than a weapon. What's more, the problem with a swarm is that it doesn't discriminate.
6. Automated War Drones
The thing about putting humans in war zones is that they can make complex judgement calls. The other thing about war is that, although you want to keep your troops as safe as possible, the risk to human life can act as a tempering force and prevent all-out chaos.
Lethal drones operated remotely, with a real human pushing the buttons, are one thing, but they're resource-heavy. A robot that can select a target, aim its gun and fire without human intervention certainly frees up some manpower, but who is held responsible for its actions?
Then consider the arms race. One side sends in automatons, the other side has no choice but to follow suit or be decimated; human soldiers gradually withdraw from the battlefield, leaving robots to fight robots, along with any civilians caught in the crossfire.
The real kicker is that, without the caution that comes with sending human troops into battle, you can take more and greater risks, apply more deadly force and do it all safe in the knowledge that you won't lose a single man.
5. The Alcubierre Warp Drive
In our minds, the distant future tends to hold at least a hopeful hint of intergalactic travel. Unfortunately, the sheer distances involved are threatening to limit us to our own solar system and no further, unless we can somehow crack faster-than-light (FTL) travel.
FTL travel is impossible in linear space, but the warp drive - dreamt up in sci-fi and backed by science - has always offered a little glimmer of hope. Rather than travelling through space at face-melting speeds, it works by bending spacetime around it and bringing your destination to you.
It was all looking like a legitimate, if far-off, possibility, until a team of physicists in Italy found a fatal flaw.
When you fire up a warp drive, high-energy particles that occur throughout the universe get caught up in its warp field. This isn't a problem while travelling; the issue is that stopping at any point releases the particles and could destroy whole star systems - possibly even generating black holes.
Not exactly the best way to make an entrance.
4. Virtual Prisons
When discussing the potential of "mind upload" technology, people often focus on virtual holidays, or even virtual lives. People discuss the potential to upload our minds into computers, or even duplicate them and keep a backup of our very selves.
What people don't often discuss is the potential this kind of technology has in the world of crime and punishment: specifically, the capabilities for time manipulation it would offer us.
A virtual mind in a simulated world could be made to experience time in any way we like. We could make an hour pass in a minute, or stretch an hour out over 30 years. This would mean that, rather than taking up cell space and using valuable resources in real time, a criminal could serve a 100-year sentence in a day.
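The arithmetic behind that claim is simple to check. Assuming simulated subjective time scales linearly with the clock ratio (an assumption of this sketch, not something the article establishes), the required speed-up factor is just subjective seconds divided by real seconds:

```python
# Back-of-the-envelope check on the time-dilation claims above.
# Assumes subjective time in the simulation scales linearly with clock speed.

SECONDS_PER_DAY = 24 * 60 * 60

def speedup_factor(subjective_years: float, real_days: float) -> float:
    """How many simulated seconds must pass per real-world second."""
    subjective_seconds = subjective_years * 365 * SECONDS_PER_DAY
    real_seconds = real_days * SECONDS_PER_DAY
    return subjective_seconds / real_seconds

# A 100-year sentence served in a single real-world day:
print(round(speedup_factor(100, 1)))  # → 36500
```

So the simulation would need to run roughly 36,500 times faster than real time - steep, but a fixed multiplier rather than anything physically forbidden.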
Not only would this be one hell of a deterrent, but we would be able to apply a rigorous course of rehabilitation in a matter of minutes. It would also be enough to crack anyone's mind.
3. Gene Sequencing For Pathogens
So, we now live in a world in which the details of how to make mutant bird flu have been published in the public domain.
This is obviously in pursuit of a greater understanding of (and therefore a cure for) deadly diseases, but sequencing and altering the genomes of pathogens and then publishing the results could land mankind in a lot of hot water.
Whilst the information is intended for noble use, there are those who could do a lot of damage with that information. Bird flu, for example, is currently unlikely to spread from human to human, but if someone were to make it airborne, then genetically engineered biological weapons are on the cards.
2. Electronic Telepathy
Mind-to-mind communication is one of those sci-fi technologies that actually might not be as far off as you'd think, with some experts estimating that the technology could be usable in the next 20 years.
This near-future technology will undoubtedly be rudimentary, but we would still have to consider the consequences of opening our brains up to the outside world. If another person can gain access to your brain to send a message, they can potentially get in there for more nefarious reasons.
If you have a chip in your brain that allows others to gain access, there is the potential for you to be hacked. That could mean anything from intrusive "brain spam" to a full takeover that alters memories, emotions and values. Depending on how you're hooked up, they might even be able to control you physically.
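No such brain chip exists, but the defence against the hacking scenario above is a familiar one from ordinary network security: refuse to deliver any message that isn't cryptographically authenticated. As a purely hypothetical sketch (the device, pairing key and `accept` gateway are all invented for illustration), standard HMAC verification already shows the shape of it:

```python
# Hypothetical sketch: an implanted interface should at minimum
# authenticate incoming messages. HMAC-SHA256 with a shared pairing
# key rejects forged or tampered payloads before they are "delivered".

import hashlib
import hmac

PAIRING_KEY = b"per-device secret established at implant time"  # invented

def sign(message: bytes) -> bytes:
    """Tag a message with the sender's shared key."""
    return hmac.new(PAIRING_KEY, message, hashlib.sha256).digest()

def accept(message: bytes, tag: bytes) -> bool:
    """Deliver only messages whose tag verifies; drop everything else."""
    return hmac.compare_digest(sign(message), tag)

msg = b"hello from a trusted sender"
print(accept(msg, sign(msg)))              # legitimate message passes
print(accept(b"brain spam", sign(msg)))    # forged payload is dropped
```

Authentication only filters who can talk to the device, of course; it does nothing about a compromised or coercive key holder, which is exactly the government scenario below.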
And that's just lone hackers; imagine what this kind of technology could be like in the hands of a controlling government.
1. Conscious Machines
It's almost taken for granted that, at some point in the distant future, we will develop the technology to create consciousness in computing. This, according to some, would be wildly unethical.
Creating a machine that can mimic consciousness, with programming so complex that it appears to be thinking and feeling, is one thing. You can programme a computer to converse with you, gauge your mood, offer emotional support and even "recognise" itself in a mirror, all without it being actually conscious or self-aware.
Building a conscious mind inside a computer would be tantamount to cruelty. For a start, we don't know how a consciousness would respond to awareness as a machine. We know how we feel as conscious humans, but there's nothing to say that the two experiences would be in any way similar.
The nature of progress would also mean that early attempts at consciousness would inevitably be flawed. If somebody told you that they were going to purposefully engineer mentally disabled humans, you'd be horrified, and this would be no different.
You then have to consider what we would want a conscious machine for. The benefit of computers is that they can perform difficult, menial tasks without becoming bored or dissatisfied and without any need for compensation or reward. To continue to expect that service from a self-aware intelligence would be the very definition of cruel and unusual.
8 Technologies That We Should Never, Ever Build