AI


Postby Smirks » Thu Dec 06, 2007 4:32 am

zine wrote:This brings up a question: can emotion be learned, or is it a totally separate entity from thought?

*waits to buy his ai slave robot*


Emotions are reactions to one's morals, likes, fears, etc. Some form of them could be "learned" by an AI.
~Smirks
"You play hard to get, I play hard to get rid of."

Postby Finesse » Thu Dec 06, 2007 2:18 pm

I don't think any emotion could be learned by an AI. There are just too many variables that go into any given emotion. Take happiness, for example.

I am happy right now because I am starting to see the results I want in pickup, and things are going decently in my life all across the board, from my health to my work to my relationships. Ask me if I'm still happy after I talk to my ex... or ask me if I'm still happy after I start classes. It could change. And the examples I listed are just a FEW of the things that could be controlling my happiness at the moment.

But happiness is relative to whether or not a person at that point is being optimistic or pessimistic. A person could conceivably be on their deathbed and still be happy because they are "going home". You just never know what situations would make a person happy or sad or mad or anything.

Emotions do take one's morals into account, but there are other things that affect them too. For instance, I would never kill anyone... but if my baby mama were to die... GREAT! That would solve a lot of my unhappiness at the moment. It is a moral faux pas to want someone to die, but when that person is the cause of most of the unrest in your life... which outweighs the other?

Postby Westfall » Thu Dec 06, 2007 3:00 pm

Vector wrote:As long as we're off topic, there does exist right and wrong, but only with respect to values. Values are entirely personal and subjective. If something supports your values, then it is right; if it opposes your values, then it is wrong. The laws of cause and effect are objective, and whether an action supports or opposes your values is a matter of objective reality (though it may not be obvious which it is).

Any claim of good or bad or right or wrong presumes a value and contains a claim about reality.

Lots of people will try to tell you (or imply) that there is only one "true" set of values. I don't believe them.


I agree with that. Have you read any Ayn Rand?

Finesse wrote:I don't think any emotion could be learned by an AI. There are just too many variables that go into any given emotion. Take happiness, for example.


The real question here is: what defines a mind? Can consciousness arise from non-biological forms? The evidence I've seen leads me to believe it can. Making a synthetic digital human brain might be an enormous task, but as computing power expands, it seems only reasonable that we will eventually be able to reverse engineer a mind indistinguishable from that of a human.

Take Tic-Tac-Toe, for example: it is a game that can be broken down into a tree of possibilities. If I make the first move, there is a set number of moves my opponent can make, each one being a branch in the tree of possible outcomes.

http://en.wikipedia.org/wiki/Image:Tic- ... e-tree.svg

Chess is a much more complicated game with a much, much larger game tree. As such, it would be possible to write a program that would never lose (given enough computing power), though chess has a number of game-tree branches beyond a googol; Shannon's classic estimate puts the number of possible chess games around 10^120, while a googol is only 10^100. On the Tic-Tac-Toe tree the exhaustive search is actually feasible; see the sketch below.
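
Here is a minimal sketch of that never-lose idea in Python. The post doesn't name an algorithm, so take this as one illustrative way to walk the whole game tree: plain minimax, where X picks the branch with the best guaranteed outcome and O picks the worst.

# Exhaustive minimax over the Tic-Tac-Toe game tree.
# Board is a list of 9 cells: 'X', 'O', or None.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (best_score, best_move) for `player`; +1 = X wins, -1 = O wins."""
    w = winner(board)
    if w == 'X':
        return 1, None
    if w == 'O':
        return -1, None
    moves = [i for i, cell in enumerate(board) if cell is None]
    if not moves:
        return 0, None          # board full with no winner: a draw
    best = None
    for m in moves:             # each legal move is a branch in the tree
        board[m] = player
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[m] = None
        if best is None or (player == 'X' and score > best[0]) \
                        or (player == 'O' and score < best[0]):
            best = (score, m)
    return best

# X to move on an empty board: searching the full tree (a few seconds in
# Python) shows the best X can force against perfect play is a draw (score 0).
print(minimax([None] * 9, 'X'))

The same idea applied to chess is exactly the "given enough computing power" caveat: the algorithm is identical, but the tree is too large to enumerate.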

I imagine existence could be modeled similarly, with a giant cone of possibilities that is ever-changing based on new information. That'd be cool. Damn, I got a little sidetracked there.

Westfall
Katie wrote:i want some count chocula right now

Kit wrote:Westfall, you're being a dick.

Postby Vector » Thu Dec 06, 2007 3:32 pm

Westfall wrote:
Vector wrote:There does exist right and wrong, but only with respect to values. Values are entirely personal and subjective. If something supports your values, then it is right; if it opposes your values, then it is wrong. The laws of cause and effect are objective, and whether an action supports or opposes your values is a matter of objective reality (though it may not be obvious which it is)...


I agree with that. Have you read any Ayn Rand?

I've got pretty much her entire library and I've read it all, except I only got halfway through We the Living. I guess I got bored with it.

What I wrote above is essentially a paraphrase of the foundation of her ethics.
I'M OUT OF THE HOUSE AND I'VE GOT MY GOGGLES ON! ONWARD TO SEX LOCATION!

Postby 101998 » Thu Dec 06, 2007 4:23 pm

Again, why would you want to engineer something with emotion? What would be the benefit? As great as our brains are, there are much more efficient models for mass data analysis and problem solving (which is why you need AI in the first place). You'd have to engineer the AI to be happy when it solves a problem and unhappy when it screws something up, and then you'd have to program how it reacts in those states. We developed emotions under evolutionary incentive; an AI wouldn't need that, because survival/procreation wouldn't even come into its realm of thinking. It would be like taking a career blue-collar male and putting him in a geisha school: completely pointless.
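
To make that "happy on success, unhappy on failure" wiring concrete, here is a toy Python sketch. Every name in it is made up for illustration; it is essentially a hand-rolled reward signal of the kind reinforcement learning formalizes, not any real AI system.

# Toy sketch: "emotion" as nothing more than a scalar reward signal.
# All names here are illustrative.

class Agent:
    def __init__(self):
        self.mood = 0.0           # stand-in for "happiness"

    def attempt(self, task):
        solved = task()           # task is any callable returning True/False
        reward = 1.0 if solved else -1.0
        self.mood += reward       # "happy" on success, "unhappy" on failure
        self.react()
        return solved

    def react(self):
        # The part you'd also have to program by hand:
        # how the agent behaves in each emotional state.
        if self.mood < 0:
            print("retrying more cautiously (mood=%.1f)" % self.mood)
        else:
            print("proceeding confidently (mood=%.1f)" % self.mood)

agent = Agent()
agent.attempt(lambda: True)       # solves a problem -> mood goes up
agent.attempt(lambda: False)      # screws up -> mood goes down

Note that nothing in the sketch helps the agent solve problems faster; the mood variable is pure overhead, which is the point of the objection above.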
Ignorance more frequently begets confidence than does knowledge: it is those who know little, and not those who know much, who so positively assert that this or that problem will never be solved by science.

- Charles Darwin

Postby Finesse » Thu Dec 06, 2007 4:28 pm

101998 wrote:Again, why would you want to engineer something with emotion? What would be the benefit? As great as our brains are, there are much more efficient models for mass data analysis and problem solving (which is why you need AI in the first place). You'd have to engineer the AI to be happy when it solves a problem and unhappy when it screws something up, and then you'd have to program how it reacts in those states. We developed emotions under evolutionary incentive; an AI wouldn't need that, because survival/procreation wouldn't even come into its realm of thinking. It would be like taking a career blue-collar male and putting him in a geisha school: completely pointless.


My thoughts exactly.

Postby Westfall » Thu Dec 06, 2007 5:05 pm

101998 wrote: We developed emotions under evolutionary incentive; an AI wouldn't need that, because survival/procreation wouldn't even come into its realm of thinking.


But if it did... if someone did make those drives important to the AI... well, we'd have a singularity on our hands.

Also, it would be nice to be able to upload one's consciousness into digital form. I need to make a backup of myself in case I get eaten by a grue.

Westfall
Katie wrote:i want some count chocula right now

Kit wrote:Westfall, you're being a dick.
