Future Technology, Surplus and Trade
Myu in the Middle
15-04-2007, 01:40
It's the science fiction dream. One day, technology will have advanced far enough for all manual labour to be done by machines, unmanned. Food production can be handled by robot, with crops grown, regulated, harvested, processed and delivered without a human hand ever needing to touch the material until the food hits your plate. The machines can do this efficiently, leaving far more of the natural ecosystem intact than current human methods, and by working in harmony with nature itself they can provide a continuous supply of food sufficient to feed the entire population of the planet, available for all to use as they see fit. Water, too, can be synthesised and distributed across the world, even to places with little of it to hand. The same goes for medication.
The question is, how does this fit in with our current notions of trade? When there is enough food to go around without any human intervention required, does it make sense to force people to work for their nourishment and to trade in order to obtain it? And what happens to the market if the basics of life can be guaranteed without the exchange of goods and services?
I suspect that, as our technological developments gradually wear away at human involvement in the production of necessary consumables, we are going to see a complete rethink of how the economy is managed. Politics and business will become less separate, as it will no longer be beneficial to think of these necessities as products rather than as enshrined rights of existence. But what do you reckon: what effect will the benefits of future technology have on the way we trade and work?
Andaluciae
15-04-2007, 01:50
People always want more. Duh.
They'll find other things that they want, things produced in the old-fashioned style. Hell, people already do that: we buy "organic foods" because they're somehow healthier and better for the environment. Whatev! Whatev!
Well, it will ultimately force us to develop new ways to value things; in a society where everything is provided for us, we may value goods and services that are hand-made or hand-provided, and our focus will instead be on the noosphere, the sum of our ideas rather than our material possessions. Ideas, creativity, invention and our contributions to society will form the new basis of value. This will especially be true if our methods of manufacturing are personalized and open; rather than buying our clothes and goods according to label, we might seek out the most creative individuals and buy their products. Ultimately, valuation will be developed at a decentralized, community level rather than a macroeconomic one, with each level interacting with and shaping the other.
Now, initially I believe these things will produce additional inequality, since human enhancement, rejuvenative, cybernetic and post-scarcity technologies will be expensive at first and open only to those who can afford them. Over time, their prices will fall and they will become accessible to more people, like any other technology, especially as the effectively limitless resources of space begin to be tapped, but initially this is a risk that will have to be managed to prevent social divergence. We want a cybernetic society that is more utopian than dystopian, or more cyberprep than cyberpunk if you will.
And, of course, we will have to examine the robotic economy as well. In a society where machines make up the workforce, the development of full machine sapience and emotion, which will be necessary for this economy to function and advance, will change the dynamics of labor and force us to change our current conceptions of society. We will have to give them equal rights, worker protections, legal protections and everything else, likely in a vein similar to the civil rights and labor movements of the first industrial revolution and the start of the information age. We would be interacting with several truly different cultures whose interests and goals may go hand in hand with our own, compete with them, or diverge from them entirely. This will represent the first time in human history that there is another species equal to or greater than us in its capabilities, and we will have to ditch our conventional human-supremacist feelings in order to cope with it.
However, some traditional aspects will remain due to economies of scale and the inherent ideal firm size of some industries. These will be most prevalent in the extraplanetary economy, where the sheer scale of operations would make it impossible for individuals to achieve the kind of productivity that is possible in other fields. The basic mechanisms of the market economy will survive in this world, but it will be a far different one.
People always want more. Duh.
Yes; as the price of a commodity approaches zero, the quantity demanded rises sharply. We will likely never have "free" goods and services because of this; people can always find a way to use more, so some rationing by the market will remain.
Curious Inquiry
15-04-2007, 03:28
Well, it will ultimately force us to develop new ways to value things; in a society where everything is provided for us, we may value goods and services that are hand-made or hand-provided, and our focus will instead be on the noosphere, the sum of our ideas rather than our material possessions. Ideas, creativity, invention and our contributions to society will form the new basis of value. This will especially be true if our methods of manufacturing are personalized and open; rather than buying our clothes and goods according to label, we might seek out the most creative individuals and buy their products. Ultimately, valuation will be developed at a decentralized, community level rather than a macroeconomic one, with each level interacting with and shaping the other.
Now, initially I believe these things will produce additional inequality, since human enhancement, rejuvenative, cybernetic and post-scarcity technologies will be expensive at first and open only to those who can afford them. Over time, their prices will fall and they will become accessible to more people, like any other technology, especially as the effectively limitless resources of space begin to be tapped, but initially this is a risk that will have to be managed to prevent social divergence. We want a cybernetic society that is more utopian than dystopian, or more cyberprep than cyberpunk if you will.
And, of course, we will have to examine the robotic economy as well. In a society where machines make up the workforce, the development of full machine sapience and emotion, which will be necessary for this economy to function and advance, will change the dynamics of labor and force us to change our current conceptions of society. We will have to give them equal rights, worker protections, legal protections and everything else, likely in a vein similar to the civil rights and labor movements of the first industrial revolution and the start of the information age. We would be interacting with several truly different cultures whose interests and goals may go hand in hand with our own, compete with them, or diverge from them entirely. This will represent the first time in human history that there is another species equal to or greater than us in its capabilities, and we will have to ditch our conventional human-supremacist feelings in order to cope with it.
However, some traditional aspects will remain due to economies of scale and the inherent ideal firm size of some industries. These will be most prevalent in the extraplanetary economy, where the sheer scale of operations would make it impossible for individuals to achieve the kind of productivity that is possible in other fields. The basic mechanisms of the market economy will survive in this world, but it will be a far different one.
I'm sure you're familiar with Rudy Rucker's Ware (http://www.fantasticfiction.co.uk/r/rudy-rucker/software.htm) series?
I'm sure you're familiar with Rudy Rucker's Ware (http://www.fantasticfiction.co.uk/r/rudy-rucker/software.htm) series?
Yep. It is part of my inspiration for the likely progression of robots as a species separate from mankind.
Also, I like it because the robots aren't held to Asimov's laws, which I really hate because they are inherently cruel, destroy free will, and reduce sentient minds to nothing more than human slaves. Frankly, that's damn cruel; we wouldn't do it to human beings, so why do it to beings with minds equal to or greater than our own?
Curious Inquiry
15-04-2007, 03:38
Yep. It is part of my inspiration for the likely progression of robots as a species separate from mankind.
Also, I like it because the robots aren't held to Asimov's laws, which I really don't like at all because they are inherently cruel, destroy free will, and reduce sentient minds to nothing more than human slaves. Frankly, that's damn cruel; we wouldn't do it to human beings, so why do it to beings with minds equal to or greater than our own?
Um. To control them for our own ends? I loved that Rucker made asimov a derogatory term :p
... then robots take over the world.
Why would we need to give robots that harvest crops intelligence?
United Law
15-04-2007, 03:43
I agree with CI. We need robotic slaves. Ah. Bringing the past to the future. We'll have to paint them black.
Um. To control them for our own ends? I loved that Rucker made asimov a derogatory term :p
We control them for our own ends, just like slaves. I find human slavery wrong, so why would I find it okay to enslave robots? If they do something wrong, they deserve the same punishment as any human being, and they deserve the same right to do what they want with their lives as any human being. Once robots can reproduce like humans, there's absolutely no justification whatsoever to use barbaric concepts like slavery or mind control to control them.
And that's one of the reasons why I like Rucker so much.
United Law
15-04-2007, 03:47
Human slavery is okay.
As long as it's criminals serving their time.
Otherwise, no.
... then robots take over the world.
Why would they do that? A robot with a conscience, emotion, and free will would be like a human being; they'd have no reason to rise up if they are treated as equals. The only reason they might do so is if ignorant humans insist on controlling them and treating them as chattel.
If humans, for all of their irrationality, can develop democracy and civil rights, then a rational machine definitely could.
Why would we need to give robots that harvest crops intelligence?
We wouldn't. Only self-aware machines would have that right, just like human beings. You wouldn't build a sentient machine to harvest crops; to perform surgery or work the stock market, absolutely, but not for manual labor.
Human slavery is okay.
As long as it's criminals serving their time.
Otherwise, no.
Just like sentient robots. No law-abiding self-aware being should be owned by another, ever, for any reason whatsoever.
Why would they do that? A robot with a conscience, emotion, and free will would be like a human being; they'd have no reason to rise up if they are treated as equals. The only reason they might do so is if ignorant humans insist on controlling them and treating them as chattel.
If humans, for all of their irrationality, can develop democracy and civil rights, then a rational machine definitely could.
We wouldn't. Only self-aware machines would have that right, just like human beings. You wouldn't build a sentient machine to harvest crops; to perform surgery or work the stock market, absolutely, but not for manual labor.
If humans can massacre less developed populations of other humans, then robots can happily massacre humans for the greater good. I've probably been watching too many bad science fiction movies, but I'd still be very frightened of anything that had greater intelligence than us.
If humans can massacre less developed populations of other humans, then robots can happily massacre humans for the greater good. I've probably been watching too many bad science fiction movies, but I'd still be very frightened of anything that had greater intelligence than us.
Yes, they can, and we can wipe ourselves out just as easily. But what do the majority of humans do? If they can, they fight to stop that murder and bring those responsible for it to justice. Robots that have a conscience would not allow their fellow robots to kill innocent people, just like we don't allow other humans to. There will undoubtedly be exceptions, just like we have our murderers, racists, tyrants and others, but the majority of humans are generally decent people.
Continually increasing human intelligence through artificial upgrading and biological enhancement, which will happen well before intelligent robots arrive, will ensure there will always be many human beings on par with our fellow robotic citizens. We will use our technology not only to develop new intelligence to aid us, but also to increase our own.
But your concerns are nonetheless very well placed. It would be mind-bogglingly foolish to go headlong into the age of intelligent robots without making sure they have emotions and a conscience; otherwise, you're creating nothing but artificial sociopaths.
Curious Inquiry
15-04-2007, 04:28
I agree with CI. We need robotic slaves. Ah. Bringing the past to the future. We'll have to paint them black.
Just to be clear, I was not advocating robot slavery, just answering Vetalia's question. The only reason I would want robots to be slaves would be so they could free themselves, a la the Boppers.
The only reason I would want robots to be slaves would be so they could free themselves, a la the Boppers.
In some ways, that might be better and longer lasting, since freedom that is achieved tends to outlast freedom that is given by someone else and can be just as easily taken away.
Most rights movements require the oppressed to achieve their goals themselves (with support, of course, from those of us already sympathetic to their cause).
H N Fiddlebottoms VIII
15-04-2007, 05:09
Also, I like it because the robots aren't held to Asimov's laws, which I really hate because they are inherently cruel, destroy free will, and reduce sentient minds to nothing more than human slaves.
And here I just hated them because they eliminated the possibility of building unstoppable, soulless killing machines. Finally, we could wage war without being worried about the angsty whining our warriors would engage in when they came homeward.
Frankly, that's damn cruel; we wouldn't do it to human beings, so why do it to beings with minds equal to or greater than our own?
Presumably, the sort of systems you're imagining would be created with some sense of pleasure (or satisfaction, contentment, whatever), so why not just make them like their jobs and positions in life? It'd be like giving a human being access to heaven from the moment they were born, and humanity gets to grow increasingly lazy and decadent until the whole species just implodes. Sounds like a win-win.
Trollgaard
15-04-2007, 05:14
Robots with rights? Come on...they are machines! Nothing more than programs with little 1s and 0s! Besides, I don't think people should make sentient machines.
And here I just hated them because they eliminated the possibility of building unstoppable, soulless killing machines. Finally, we could wage war without being worried about the angsty whining our warriors would engage in when they came homeward.
Yeah, but you get pwned the first time someone develops one that doesn't obey your laws.
Presumably, the sort of systems you're imagining would be created with some sense of pleasure (or satisfaction, contentment, whatever), so why not just make them like their jobs and positions in life? It'd be like giving a human being access to heaven from the moment they were born, and humanity gets to grow increasingly lazy and decadent until the whole species just implodes. Sounds like a win-win.
Because it's still a form of slavery. They don't have the freedom to know any better or to want something else with their lives; this would end up making everyone complacent and cause society to stagnate. We need to be able to get pissed off about our situation or to want something more if we really want to advance ourselves as a species.
Robots with rights? Come on...they are machines! Nothing more than programs with little 1s and 0s! Besides, I don't think people should make sentient machines.
And humans are simply machines made of organic compounds instead of metal; really, when we get down to it we are just as mechanical as anything we build, except that we are made of different compounds and run on imperatives programmed into our genetic code by evolution rather than by human technology.
By creating additional intelligence, we can significantly boost our knowledge as a species and achieve more, faster. Our progress would be unstoppable if we could harness not only the unique aspects of human intelligence but also the unique aspects of artificial intelligence.