-
Sun Devil
Singularity
Been reading about this concept of singularity. Kinda freaks me out.
From Wiki:
http://en.wikipedia.org/wiki/Technological_singularity
1. That a technological-evolutionary point known as "the singularity" exists as an achievable goal for humanity (the exact nature of the point is an arbitrarily high level of technology).
2. That through a law of accelerating returns, technology is progressing toward the singularity at an exponential rate.
3. That the functionality of the human brain is quantifiable in terms of technology that we can build in the near future.
4. That medical advancements could keep a significant number of his generation (Baby Boomers) alive long enough for the exponential growth of technology to intersect and surpass the processing of the human brain.
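Claims 2 and 4 boil down to a simple compounding-growth argument: pick a starting compute level, assume a fixed doubling time, and find where the curve crosses an estimate of the brain's processing rate. Here's a toy sketch of that argument — every constant (the starting ops/sec, the ~1e16 ops/sec brain figure, the one-year doubling time) is an illustrative assumption, not a number from the article:

```python
# Toy projection of the "accelerating returns" argument: find the first
# year an exponentially doubling compute trend reaches an assumed
# estimate of the brain's raw processing rate. All constants are
# made-up illustrations, not measured figures.

def year_of_parity(start_year=2008, start_ops=1e10,
                   brain_ops=1e16, doubling_years=1.0):
    """Return the first year the doubling trend reaches brain_ops."""
    year, ops = start_year, start_ops
    while ops < brain_ops:
        year += doubling_years
        ops *= 2.0
    return year

print(int(year_of_parity()))  # 2028 under these made-up assumptions
```

Note how sensitive it is: stretch the doubling time from one year to two and the parity date slides from 2028 to 2048. Small changes to the assumptions swing the prediction by decades.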
Do you think The Singularity Is Near?
-
Hood Rich
Had a friend that was really into this concept. He would always say, "if we can just survive until then, we can live forever!"
It seems to me that it's fundamentally flawed. Not all "breakthroughs" are equal. Just because you can make a chart and place them on there in an impressive-looking manner does not mean that any particular breakthrough will occur. So it makes little sense to say that, because breakthroughs have occurred at this rate over history, we can predict a specific breakthrough occurring in the future.
"We don't estimate speeches." - CBO Director Doug Elmendorf
-
supervillain
-
Senior Member
Set your watch for 2029 (apparently)
I watched an interesting show on stuff like this last week. Quantum computers are scary.
-
Flashkit historian
I thought this was one of those dating threads.
-
....he's amazing!!!
It's not about us, it's about technology, and the magic date is the one where we build AI that can build AI as well as we can. Then we can kiss our asses goodbye.
Last edited by lesli_felix; 04-15-2008 at 08:58 AM.
-
Flashkit historian
How can technology not be about us?
-
Total Universe Mod
Some speculate superhuman intelligences may have goals inconsistent with human survival and prosperity. AI researcher Hugo de Garis suggests AIs may simply eliminate the human race, and humans would be powerless to stop them.
How interesting would it be if they just demolished any structure that could be considered a place of worship?
-
....he's amazing!!!
Originally Posted by Frets
How can technology not be about us?
It is... until it surpasses us.
-
Wait- what now?
Why would we ever need an intelligence this smart? If one was made, it would obviously have low capabilities; "robots" surely would be kept in two distinct groups: the thinkers and the doers. If we need a PC that can predict the expansion of the universe, we create a thinker with no motors and no capability to launch missiles, and I highly doubt we would interlink it with other robots. Then you would have the kind of bots that clean your house: the doers. The mistake in I, Robot was that they let the big thinking robot communicate directly with the working robots. Tsk tsk.
"I'd only told them the truth. Was that so selfish? Our integrity sells for so little, but it is all we really have. It is the very last inch of us, but within that inch, we are free."
-
Flashkit historian
Somebody spends too much time watching The Matrix.
Humans create technology. If the technology we create destroys us it is about us.
-
....he's amazing!!!
Originally Posted by Frets
Somebody spends too much time watching The Matrix.
Somebody thinks being patronising sounds cool. It doesn't.
Most of the ideas about AI and technology in the Matrix have been around for years, many borrowed from other science fiction, many from scientific speculation.
Originally Posted by Frets
Humans create technology. If the technology we create destroys us it is about us.
Now you're being philosophical. I'm not sure the machines we build will share your sentiments.
Originally Posted by Tidenburg
Why would we ever need an intelligence this smart? If one was made, it would obviously have low capabilities; "robots" surely would be kept in two distinct groups: the thinkers and the doers. If we need a PC that can predict the expansion of the universe, we create a thinker with no motors and no capability to launch missiles, and I highly doubt we would interlink it with other robots. Then you would have the kind of bots that clean your house: the doers. The mistake in I, Robot was that they let the big thinking robot communicate directly with the working robots. Tsk tsk.
Sure, Tidenburg. If you were the arbiter of what technology we do and don't develop, then we'd all be safe. I'm sure you think we don't need malicious viruses either; let's hope nobody ever tries to write one.
Unfortunately you're both looking at the possibility of sensible, logical humans building terrible and destructive machines. It's the possibility of terrible, destructive humans building sensible and logical machines that we need to be wary of.
Oh and Frets.... I think it's.... inevitable.
-
Wait- what now?
Heh, Frets, I've only ever watched The Matrix through once. I just think that in most films (yes, I know films and reality differ greatly) the key error is always that they let the big power robots take orders from the huge robot intelligence. And I know I wouldn't be any good as "the arbiter of what technology we do and don't develop"; I was just stating an observation.
I vote for Google to create the first all-knowing supercomputer (and it has to talk back to you; it doesn't have the same effect otherwise). They probably have some form of access to a huge chunk of the world's information now anyway.
"I'd only told them the truth. Was that so selfish? Our integrity sells for so little, but it is all we really have. It is the very last inch of us, but within that inch, we are free."
-
Total Universe Mod
Forget about our mortality, or the thought of it, that makes these guesses so compelling. What level of guilt do we admit to by even suggesting that as a possibility? Aren't we basically just saying, "we know we suck"? I'd love to tangent off about how willing we should be to admit the other areas we suck at, but I'll try to stay on topic.
Which of you developers out there actually see code running this badly? Now, I've never worked on a large team of developers, but I find it hard to imagine that anyone involved in a project large enough to produce such an anomaly would not notice the potential along the way. Surely there's enough sci-fi available to make it a concern. Hell, I even wrote a short story of my own based on it in high school. Oddly, it's still the only plausible scenario for such an event, and far-fetched at that.
People are not as stupid as their governments make them out to be. We should have more faith in each other.
-
Juvenile Delinquent
Originally Posted by lesli_felix
Unfortunately you're both looking at the possibility of sensible, logical humans building terrible and destructive machines. It's the possibility of terrible, destructive humans building sensible and logical machines that we need to be wary of.
Cases of both already exist
-
poet and narcisist
Originally Posted by Frets
Somebody spends too much time watching The Matrix.
Now if you go watch Ghost in the Shell, it presents way better concepts, it's overall better, and it rules.
-
Flashkit historian
jaquan-
Seeing the looming irregularity and doing something about it are often quite different.
Look at the Great Depression. It wasn't only the stock market crash; it was also bad economic decisions made at the time.
Or, for something more recent, look at Sony's rootkit story:
http://en.wikipedia.org/wiki/2005_So...ection_scandal
Knowing what the options were, they made a bad decision, which hurt them more than it helped them. After which they made another bad decision.
Programmers aren't in the ethics business. Neither are their employers.
If a programmer has an ethical dilemma, the company will often just replace the programmer. Usually, though, programmers are more concerned with the immediate "I want to try this because I think I can do it" than with the long-term question of what impact this will have on society.
And with the plethora of programmers seeking new challenges, it is rather easy to replace one with ethical limitations with another without them.
-
Total Universe Mod
Now who's been watching too many movies? jk
Just saying that the more capable CEOs, while bound to their investors, would still try to think long term. And with the process involved in rolling something out to the public, I doubt a would-be culprit would get much further than an IPO.
Still, the possibility remains. It wouldn't be as glamorous as an army of bipeds marching down the street (such an arrogant human concept) so much as a quick poisoning of the water supply and a few WMDs.
-
Flashkit historian
I don't need movies or TV. Heck, most of it is plagiarised from history and current events.
-
Total Universe Mod
South Park drew a great link between the loss of the internet and The Grapes of Wrath last night.