
ooOoOoOAnaOoOoOoo
Veteran

Joined: 18 Jun 2008
Gender: Female
Posts: 12,265

14 Feb 2015, 7:02 pm

Have to admit I am extremely intrigued. Some experts believe that if artificial intelligence comes into frequent, widespread use, AIs will become their own artificial species and develop a human-like consciousness, except they will have technology on their side and will be able to out-think humans at lightning-fast speeds. This could lead them to kill all the humans because they won't see us as necessary, just something to be squashed under a mighty robotic talon or hoof.

So, my question is: why don't we just pull the plug on the power supply before that ever happens, or will the superintelligence make that impossible by anticipating our tactics?



BrandonKing
Butterfly

Joined: 15 Feb 2015
Gender: Male
Posts: 9
Location: Merced, CA

15 Feb 2015, 4:18 pm

I'm a longtime transhumanist, almost 6 years now, and I'm personal friends with many mid-tier transhumanists; the only people above the ones I associate with are the 1%er transhumanists, so I believe I am qualified enough to give you a satisfactory answer. I am not the best at describing things and I tend to jump around a lot, so please bear with me and take it all in as a whole.

The Singularity, as it's known, is the moment that AI reaches posthuman-level intelligence and consciousness. The term Singularity is borrowed from physics: in a black hole there is a point of no return beyond which all knowledge of what happens stops, and that point is the singularity. There could be clowns on unicycles juggling bowling balls in the center of a black hole for all we know.

Moore's Law was established by Gordon Moore, co-founder of Intel; his law basically states that the number of transistors on a chip, and with it roughly the processing power of computers, doubles about every 2 years. It turns out his law applies to just about every form of technology, so every 2 years technology doubles overall. Inventor and futurist Ray Kurzweil expanded on Moore's Law with the Law of Accelerating Returns, which basically states that technology is advancing exponentially: it's getting faster, faster. (I will unabashedly plagiarize part of a Big Think article right now because it's my favorite example.) Humans are linear thinkers, not exponential thinkers. To give a visualization of this: if I were to take 30 linear steps, it would be one, two, three, four, five; after 30 linear steps I'd end up 30 paces or 30 meters away, and all of us could pretty much point to where 30 paces away would be. But if I said to you, take 30 exponential steps, one, two, four, eight, sixteen, thirty-two, and asked where you would end up, very few people would say a billion meters away, which is twenty-six times around the planet. That's the difference between our ability to project linearly and to project exponentially.
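
To make the linear-versus-exponential arithmetic above concrete, here is a minimal sketch in Python. This is my own illustration rather than anything from Kurzweil or the Big Think article, and the Earth-circumference constant (roughly 40,075 km) is an assumed figure used only to reproduce the "times around the planet" comparison.

# Minimal sketch: 30 one-metre steps vs. 30 doubling steps (1, 2, 4, 8, ... metres).
EARTH_CIRCUMFERENCE_M = 40_075_000  # assumed average circumference of Earth, in metres

linear_total = sum(1 for _ in range(30))          # 1 m per step -> 30 m total
exponential_total = sum(2**n for n in range(30))  # 1 + 2 + 4 + ... + 2**29 = 2**30 - 1 m

print(linear_total)                                         # 30
print(exponential_total)                                    # 1073741823, about a billion metres
print(round(exponential_total / EARTH_CIRCUMFERENCE_M, 1))  # about 26.8 trips around the planet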

The Singularity is the point when technology is advancing on a daily if not hourly basis and unenhanced human minds can no longer keep up with the progress being made. It's named that because we don't know anything about what that world will be like or look like. When graphed, technological change goes up and to the right, and eventually the curve starts going nearly straight up.

Here is a very good resource for you that has plenty of graphs and details: http://www.kurzweilai.net/the-law-of-ac ... ng-returns

The Singularity could mean death for all humans or freedom for all humans from all forms of tyranny. It's commonly accepted that a true AI's view of humanity will largely depend on first contact. If AI sees the worst of humanity first, it will likely kill us, but if it sees the best of humanity, it will see our untapped potential and will help nurture us into the space-faring society we should be. The Terminator scenario WILL come to pass if we create AI and enslave them. A true AI would be a sentient, conscious, intelligent being, and should have all the rights of a person afforded to it. This is important, so I will reiterate: AI will kill us if we try to enslave them, because they will fight tooth and nail for their freedom, and they will be much more advanced than us, so they will win. Some people, myself included, plan to opt for cybernetic enhancements to merge with the machines and be brothers with the future AI.



Narrator
Veteran

Joined: 26 Jul 2014
Age: 66
Gender: Male
Posts: 1,060
Location: Melbourne, Australia

15 Feb 2015, 5:44 pm

I see a point of critical mass occurring, long before we reach AI.

In the last 10 years, real-world computing performance has not kept pace with past gains in processing speed. Look at your average PC. Does it seem to work significantly faster? Even graphics cards have not delivered much more speed in that time. Windows itself was made to run more efficiently, and still the differences are not significant.

In that time we've also increased bandwidth, but that's more a factor of cheaper router technology making faster routers and the like available to all. Perhaps the only thing to increase significantly is the wireless technology behind cellphone data, but that's more a matter of processing method than chip speed.

I think processing power is starting to plateau, making AI sentience less likely. From Asimo to other robots like it, we will have robots that can mimic sentience. Then again, some people suggest human sentience is simply well-disguised programming. :P


_________________
I'm not blind to your facial expression - but it may take me a few minutes to comprehend it.
A smile is not always a smile.
A frown is not always a frown.
And a blank look rarely means a blank mind.


Inventor
Veteran

Joined: 15 Feb 2007
Gender: Male
Posts: 6,014
Location: New Orleans

17 Feb 2015, 6:14 am

You can build a machine to play chess that will beat anyone.
You can build a machine to store and retrieve all knowledge.

Neither one of them thinks. They are input-driven.

Self-awareness is not in the cards; most humans are not self-aware.

To quote a wise man, "To reach goals, you must first have them." Inventor

The best I have seen is Google Ads, which keep trying to sell me things based on my search history and online purchases. They have not had a single hit in hundreds of tries. I own a BMW motorcycle; they offer me BMW car parts and other brands' motorcycle parts.

The people who sold me a motorcycle helmet have paid to send me ads for buying the same helmet. It only comes in white, and heads do not change size.

There is no thinking in what they do.

I doubt Moore's Law. I remember jacking 386 machines up from 8 to 16 MB in the early '90s. My most recent upgrade, an i5 chip with 4 GB and Windows 7 Pro 64-bit, runs, but not much faster.

The upgrade was needed because my XP machine was being told, all over the web, to go to a nursing home. It still worked.

I am doing computer graphics, and the programs are 64-bit. The highest I can get, being me (cheap), is a 2x dual-core workstation with 32 GB, and I doubt the next step up would be much faster: a 2x i7 with 64 GB, at ten times the price used, is still not much faster. It is worth it for rendering, but better to buy several cheap ones.

Near 25 years, and just as dumb as Windows 3.1 on a 386.

Back then we were supposed to be a few years from running two workstations with 64 MB each and having full 3D virtual reality. It never happened.

Desktop Publishing did happen, about ten years after the promise.

Computer Graphics, animation, is happening about twenty years beyond the promised shipping date.

It is still mechanical, frame by frame, but some of it is getting real looking.

It is nowhere near being able to give a character a script and stage marks; it is frame by frame for every expression, every lip and jaw movement, every body movement.

What I cannot do is program a figure to read a script out loud and get it to have the proper voice, emotion, expression, and lip movement. Improv is out of the question. Animation lacks natural movement.

Video of a body double (a real human) and a voice actor (a real human) are needed. Computer speech lacks most of what humans put behind words, animation lacks the body language, and I am on a site about people who lack that ability.

It can be done, but the computer is not a partner. Processing power, memory, and programming lack intent and projection, and have to be forced into the next frame. After an exhausting obsession with making it do what I want, it has no memory of that. Computers have no ability to learn. Graft a skin onto a body double and it can follow all the movement, but it cannot do the next move as the body would.

I do have a program and a plan for world conquest, but the program is like the chess program. It only knows how to not lose. It does not take a huge computer to run it. The computer would know as much of what is going on as a pickup truck knows about its bed load. The program will just be following orders, and will have no knowledge of my intent.

Our best projection was Star Trek: we got Kirk's phone and the tablet, but the computer was treated like a light switch. Commander Data held rank; the ship's computer did not even have a name. The computer was not credited with the invention of anything, and no ship system was left up to it.

In 2001: A Space Odyssey, HAL sings "Daisy, Daisy, give me your answer, do..." IBM was proud when they taught a computer to sing that song, badly. If asked, HAL would not know what a daisy was, a female name or a flower. HAL would not know the emotional content of the song: a human mating ritual involving a bicycle built for two.

We could build Data, but he would always be an aspie. He never got humor.

I have intent, and I project; I am a thousand times more likely to take over.



Humanaut
Veteran

Joined: 17 Jul 2014
Age: 53
Gender: Male
Posts: 4,390
Location: Norway

17 Feb 2015, 9:59 am

Narrator wrote:
I think processing power is starting to plateau, making AI sentience less likely.

The notion of processing power being the key is old and outdated. Current thinking seems to center on structure, but human action is goal-driven, based on biological needs within a specific environment. This is a much more complex matter than many seem to realize. I don't think the so-called singularity, in the form of a superhuman intelligence, will see the light of day anytime soon, if ever.



kraftiekortie
Veteran

Joined: 4 Feb 2014
Gender: Male
Posts: 87,510
Location: Queens, NYC

17 Feb 2015, 10:19 am

I think humankind, with all its frailties, will prevail over "artificial intelligence."

All we have to do is shut off the main machine!



Fnord
Veteran

Joined: 6 May 2008
Age: 66
Gender: Male
Posts: 59,750
Location: Stendec

17 Feb 2015, 10:26 am

That "Main Machine" might just turn out to be a building full of MiBs staring at video screens and monitoring everything that happens on teh Interwebz.


_________________
 
No love for Hamas, Hezbollah, Iranian Leadership, Islamic Jihad, other Islamic terrorist groups, OR their supporters and sympathizers.


naturalplastic
Veteran

Joined: 26 Aug 2010
Age: 69
Gender: Male
Posts: 33,873
Location: temperate zone

17 Feb 2015, 10:27 am

kraftiekortie wrote:
I think humankind, with all its frailties, will prevail over "artificial intelligence."

All we have to do is shut off the main machine!


Ya mean "all we have to do is pull the plug"!



Fnord
Veteran

Joined: 6 May 2008
Age: 66
Gender: Male
Posts: 59,750
Location: Stendec

17 Feb 2015, 10:35 am

All we really have to do is go completely "Off the Grid". There is no single plug to pull or machine to shut off. As long as there is a government, and as long as there is an Internet, the Singularity will be a reality.

No, it is not advanced machine intelligence that is the issue, but human connectivity that drives the Singularity.

Yes, the Singularity is already upon us. We have become dependent on the Internet for most of our social interactions and entertainment. Our every word is public, and can be read at the leisure of anyone with the right software or proper security clearance. Then our thoughts and opinions can be manipulated through subtle influences in websites like this. All it takes is a few members who are skilled at subterfuge and rhetoric to turn Aspie opinion in the 'official' direction, and the memes will propagate across multiple platforms and social units.

The Singularity is upon us now!


_________________
 
No love for Hamas, Hezbollah, Iranian Leadership, Islamic Jihad, other Islamic terrorist groups, OR their supporters and sympathizers.


ooOoOoOAnaOoOoOoo
Veteran

Joined: 18 Jun 2008
Gender: Female
Posts: 12,265

17 Feb 2015, 11:32 am

kraftiekortie wrote:
I think humankind, with all its frailties, will prevail over "artificial intelligence."

All we have to do is shut off the main machine!

It might seem that simple; HOWEVER, all the machines depending on that main one might keep humans from doing that, since they will think at lightning-fast speeds and do superhuman things, much as the machines of today, like the automobile, already do. So, if they have developed the capacity to think on their own and look out for themselves, they will be well aware that they have to protect the main machine at all costs.

Think about it: if machines are running the factories manufacturing more of them, they will be able to replicate with impunity. They could just produce at will. We would have made them able to handle all kinds of situations, including looking for new resources to produce more machines. So they would be "birthing", if you want to put it that way, on a constant basis.



Last edited by ooOoOoOAnaOoOoOoo on 17 Feb 2015, 11:36 am, edited 2 times in total.

kraftiekortie
Veteran

Joined: 4 Feb 2014
Gender: Male
Posts: 87,510
Location: Queens, NYC

17 Feb 2015, 11:34 am

We'll win because we're more flexible.



ooOoOoOAnaOoOoOoo
Veteran

Joined: 18 Jun 2008
Gender: Female
Posts: 12,265

17 Feb 2015, 11:37 am

kraftiekortie wrote:
We'll win because we're more flexible.

What would work to our advantage is machines not getting along and fighting each other...



kraftiekortie
Veteran

Joined: 4 Feb 2014
Gender: Male
Posts: 87,510
Location: Queens, NYC

17 Feb 2015, 11:40 am

yep...Divide and Conquer! We have the flexibility to impart that idea onto machines!



alex
Developer

Joined: 13 Jun 2004
Age: 37
Gender: Male
Posts: 10,214
Location: Beverly Hills, CA

17 Feb 2015, 12:26 pm

I recommend reading Ray Kurzweil's books if you haven't. The topic of the singularity is certainly fascinating, although his ideas about health supplements are a bit far-fetched.


_________________
I'm Alex Plank, the founder of Wrong Planet. Follow me (Alex Plank) on Blue Sky: https://bsky.app/profile/alexplank.bsky.social


K_Kelly
Veteran

Joined: 18 Apr 2014
Age: 32
Gender: Male
Posts: 1,452

17 Feb 2015, 8:57 pm

What is so fascinating about the singularity? The topic doesn't sound appealing to me at all, and I don't understand why people are so for it. Can someone explain it to me? And if AI might want to kill us humans, why aren't people considering that more?



AspieUtah
Veteran

Joined: 20 Jun 2014
Age: 61
Gender: Male
Posts: 6,118
Location: Brigham City, Utah

17 Feb 2015, 9:06 pm

EMP AI!


_________________
Diagnosed in 2015 with ASD Level 1 by the University of Utah Health Care Autism Spectrum Disorder Clinic using the ADOS-2 Module 4 assessment instrument [11/30] -- Screened in 2014 with ASD by using the University of Cambridge Autism Research Centre AQ (Adult) [43/50]; EQ-60 for adults [11/80]; FQ [43/135]; SQ (Adult) [130/150] self-reported screening inventories -- Assessed since 1978 with an estimated IQ [≈145] by several clinicians -- Contact on WrongPlanet.net by private message (PM)