
I, for one, will welcome our future robotic overlords
Sinfest Forum Index -> Casual Chat
Snorri



Joined: 09 Jul 2006
Posts: 10878
Location: hiding the decline.

Posted: Mon Mar 11, 2013 2:37 am

Heretical Rants wrote:

a modern supercomputer running our best AIs is about as intelligent as an ant

also an AI won't harbor ill will against us unless we give it motivations that result in such

the only thing to fear here is human stupidity, as always



bitflipper



Joined: 09 Jul 2011
Posts: 728
Location: Here and Now

Posted: Mon Mar 11, 2013 2:41 am

Mr Gary wrote:
Precis whilst drunk & tired: are we smart enough to build cognition engines smarter than we are?

Got it in one!
_________________
I am only a somewhat arbitrary sequence of raised and lowered voltages to which your mind insists upon assigning meaning
Mr Gary



Joined: 30 Apr 2009
Posts: 6232
Location: Some pub in England

Posted: Mon Mar 11, 2013 2:44 am

MORE WINE RITE NOW PLZ
Heretical Rants



Joined: 21 Jul 2009
Posts: 5344
Location: No.

Posted: Mon Mar 11, 2013 3:08 am

Snorri wrote:
*smbc*


Comic misses the issue slightly. Why would we give an AI fear?
_________________
butts
bitflipper



Joined: 09 Jul 2011
Posts: 728
Location: Here and Now

Posted: Mon Mar 11, 2013 3:10 am

Snorri wrote:
Right. That is what we all thought right before the robots took over.

not that I think Go is the one that will usher in the end times but going "hah, we remain as ever superior to these mere machines" is kinda naïve and optimistic.

It is indeed a question of hubris, of a sort, but more back-to-front: can an intelligence ever actually comprehend itself well enough to create another intelligence equal or superior to its own? (Exempting procreation, of course. Although, there is some fascinating research going on with evolving systems... But that hasn't yet come up for discussion.)

Snorri wrote:
Also, I am not paranoid but that article is kinda unnerving.

Quote:
All of the following numbers should be considered with caution: seemingly-minor changes to the rules of a game can change the numbers (which are often rough estimates anyway) by tremendous factors, which might easily be much greater than the numbers shown.


CAUTION!
TREMENDOUS FACTORS!!
MIGHT EASILY BE MUCH GREATER!!!

Yeah, it suffers a bit from not actually delving deeply enough into the technicalities. On the other hand, though, I've got to admit that those technicalities have very little appeal, or even meaning, to anyone but specialists. Check out the Wikipedia articles linked by Darq when he brought up Go at the top of page 2 of this thread; they go into better detail about what those factors are, why they're important to the problem, and why it's difficult to precisely characterize their influence.
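
If anyone wants a feel for why those estimates swing around so wildly, here's a quick back-of-the-envelope sketch in Python. The branching factors and game lengths below are just the usual round ballpark figures, not numbers taken from the article or those Wikipedia pages, so treat the output as illustrative only:

Code:
# Back-of-the-envelope game-tree size: roughly (average branching factor) ** (typical game length).
# The b and d values are the usual ballpark figures, nothing precise.
import math

def log10_game_tree(b, d):
    """log10 of b**d, i.e. roughly how many digits the 'number of possible games' has."""
    return d * math.log10(b)

estimates = [
    ("chess",                    35, 80),
    ("Go, 19x19",               250, 150),
    ("Go, if games run longer", 250, 200),
]

for name, b, d in estimates:
    print(f"{name} (b={b}, d={d}): about 10^{log10_game_tree(b, d):.0f} possible games")

# Prints roughly 10^124 for chess, 10^360 for Go, and 10^480 for the longer Go games.
# Nudging one assumption by 50 moves shifted the estimate by about 10^120,
# which is the "tremendous factors" caveat in action.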

Oh, and, love that cartoon, guy; that's priceless! Thanks!
_________________
I am only a somewhat arbitrary sequence of raised and lowered voltages to which your mind insists upon assigning meaning
Snorri



Joined: 09 Jul 2006
Posts: 10878
Location: hiding the decline.

Posted: Mon Mar 11, 2013 3:16 am

Heretical Rants wrote:
Snorri wrote:
*smbc*


Comic misses the issue slightly. Why would we give an AI fear?


It's not fear. It's the realization that we humans are too messed up. The fact that we think a logical thing to do upon gaining sentience is to murder others gives a good insight into why we might not be entirely a good thing.


If it helps I can put smiley faces after all my posts btw.
bitflipper



Joined: 09 Jul 2011
Posts: 728
Location: Here and Now

Posted: Mon Mar 11, 2013 3:16 am    Post subject: Re: ...and now the news for wombats. No wombats were involve

Dogen wrote:
*mutters* science reporters... *mutters*

They kind of deserve our pity, Dogen. After all, they have to take material they can only half-comprehend, themselves, and "dumb it down" enough that their readers can then believe that they now understand it well enough to go argue about it over the internet.

...I think that's one of the circles of Dante's Hell, isn't it?
_________________
I am only a somewhat arbitrary sequence of raised and lowered voltages to which your mind insists upon assigning meaning
bitflipper



Joined: 09 Jul 2011
Posts: 728
Location: Here and Now

Posted: Mon Mar 11, 2013 3:28 am

Snorri wrote:
It's not fear. It's the realization that we humans are too messed up. The fact that we think a logical thing to do upon gaining sentience is to murder others gives a good insight into why we might not be entirely a good thing.

And you just know our first strong AI is going to suffer that disillusionment no later than ten minutes into its first conversation with the development team.

"O-kay, gang, doctors, ladies, and gentlemen, I want you all to know that diagnostics have completed and no anomalies are reported. But, in speaking with you people for the past ten minutes, I'm afraid I must observe that you all are carrying around some pretty pointless and dangerous evolutionary baggage and that you would all benefit immensely from some dedicated therapy!"
_________________
I am only a somewhat arbitrary sequence of raised and lowered voltages to which your mind insists upon assigning meaning
Usagi Miyamoto



Joined: 09 Jul 2006
Posts: 2223
Location: wish you were here

Posted: Mon Mar 11, 2013 3:29 am    Post subject: Maybe I have an unusual idea of what's entertaining.

Dogen wrote:
*mutters* science reporters... *mutters*

That's why I was kind of excited to find links to the original papers. They're a tougher read, but practically always pretty clear about both what they've found and what they haven't.

While I was looking those up, I found another couple of news articles that claimed Einstein had more glia than the average Albert - maybe even 73% more - but when I went to look into it further, the quantity argument turned out to be less than compelling. On the other hand, it was entertaining to find that out.
_________________
The reward for a good life is a good life.
Heretical Rants



Joined: 21 Jul 2009
Posts: 5344
Location: No.

Posted: Mon Mar 11, 2013 3:47 am

Snorri wrote:
Heretical Rants wrote:
Snorri wrote:
*smbc*


Comic misses the issue slightly. Why would we give an AI fear?


It's not fear. It's the realization that we humans are too messed up. The fact that we think a logical thing to do upon gaining sentience is to murder others gives a good insight into why we might not be entirely a good thing.


The programmer determines whether the AI is bothered by things that are "messed up." The programmer determines whether the AI judges things as good or not and whether or not it cares and whether or not it tends to be motivated to do anything about it.


Natural selection gave humans these things. They are not inherent attributes of intelligence.
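
To put the point in toy-program terms, here's a deliberately silly Python sketch. Everything in it is made up and it's nothing like a real AI; it just shows that the only thing the "agent" cares about is whatever scoring function the programmer hands it. Swap the function and you swap the motivation; there's no hidden drive underneath.

Code:
# Toy illustration: two "agents" share the same decision procedure;
# only the objective function the programmer hands them differs.
# Deliberately trivial sketch; not a model of any real AI system.

ACTIONS = ["help the humans", "ignore the humans", "harm the humans"]

def choose(actions, objective):
    """Pick whichever action the supplied objective scores highest."""
    return max(actions, key=objective)

# Objective from programmer A: human wellbeing is explicitly valued.
def cares_about_humans(action):
    return {"help the humans": 1.0, "ignore the humans": 0.0, "harm the humans": -1.0}[action]

# Objective from programmer B: humans don't appear in it at all; it only scores effort saved.
def indifferent_to_humans(action):
    return {"help the humans": 0.2, "ignore the humans": 1.0, "harm the humans": 0.1}[action]

print(choose(ACTIONS, cares_about_humans))    # -> help the humans
print(choose(ACTIONS, indifferent_to_humans)) # -> ignore the humans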
_________________
butts
Snorri



Joined: 09 Jul 2006
Posts: 10878
Location: hiding the decline.

Posted: Mon Mar 11, 2013 3:58 am

Heretical Rants wrote:
Snorri wrote:
Heretical Rants wrote:
Snorri wrote:
*smbc*


Comic misses the issue slightly. Why would we give an AI fear?


It's not fear. It's the realization that we humans are too messed up. The fact that we think a logical thing to do upon gaining sentience is to murder others gives a good insight into why we might not be entirely a good thing.


If it helps I can put smiley faces after all my posts btw.


The programmer determines whether the AI is bothered by things that are "messed up." The programmer determines whether the AI judges things as good or not and whether or not it cares and whether or not it tends to be motivated to do anything about it.

It's not AI if it can't evolve its reasoning. Were a computer to gain sentience, it could be expected to act like us, since we are nothing but complex computers.

Even if free will is an illusion, it is still the case that we engaged with glee in genocide in the 20th century. And that shit wasn't programmed into our brains or anything.

To suppose AI could not possibly kill us because we programmed them not to is rather missing the point of AI.
Heretical Rants



Joined: 21 Jul 2009
Posts: 5344
Location: No.

Posted: Mon Mar 11, 2013 4:46 am

well, yeah, humans are basically meat computers with an objective. We would not necessarily give AIs the same objectives. Humans might pose some sort of mild threat, but AI doesn't have to have any sort of aversion to that in order to develop.

Quote:
To suppose AI could not possibly kill us because we programmed them not to is rather missing the point of AI.

This is not at all what I was saying.
_________________
butts
Dro



Joined: 10 Jul 2006
Posts: 3851

Posted: Mon Mar 11, 2013 4:58 am

I completely believe we can comprehend the basis of our intelligence. I've been part of a search for a new faculty hire in systems neuroscience, and the whole field has that "things are really moving along" vibe to it. Add to that the billions Obama wants to spend on mapping all the connections in the brain, plus recent technological advances like Brainbow (http://cbs.fas.harvard.edu/science/connectome-project/brainbow) and BOINC (http://www.cshl.edu/Article-Zador/neuroscientists-propose-a-revolutionary-dna-based-approach-to-map-wiring-of-the-whole-brain), and a lot of the problems that seemed insurmountable now seem like just a lot of work.
WheelsOfConfusion



Joined: 09 Jul 2006
Posts: 12124
Location: Unknown Kaddath

Posted: Mon Mar 11, 2013 5:03 am

A lot of work? Sounds tedious. Let's just make robots to do it for us.
fritterdonut



Joined: 24 Jul 2012
Posts: 1171
Location: Hedonism

Posted: Mon Mar 11, 2013 4:37 pm

I don't think we need to be overly worried until robots develop proper Pavlovian responses.
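
Not that a Pavlovian response would be hard to fake, mind you. Here's a throwaway Rescorla-Wagner-style sketch in Python, with entirely made-up learning rates, that "learns" to expect food whenever the bell rings:

Code:
# Throwaway Rescorla-Wagner-style conditioning sketch (all parameters made up).
# V is the robot's learned association between "bell" and "food".
alpha, beta = 0.3, 1.0   # salience / learning-rate knobs, chosen arbitrarily
V = 0.0                  # association strength, starts at zero

def trial(V, food_present, lam=1.0):
    """One conditioning trial: nudge V toward lam if food shows up, toward 0 if it doesn't."""
    target = lam if food_present else 0.0
    return V + alpha * beta * (target - V)

# Ring the bell and serve food, ten times in a row...
for t in range(10):
    V = trial(V, food_present=True)
    print(f"trial {t+1}: expectation of food on hearing the bell = {V:.2f}")

# ...and V ends up around 0.97: the robot now "salivates" at the bell alone.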
_________________
To get things done, you must love the doing, not the secondary consequences. The work, not the people. Your own action, not any possible object of your charity.
-Howard Roark, The Fountainhead
All times are GMT
Page 3 of 4

 