zine wrote:this brings up a question: can emotion be learned, or is it a totally separate entity from thought?
*waits to buy his ai slave robot*
Emotions are reactions to one's morals, likes, fears, etc. Some form could be "learned" by AI.
Vector wrote:As long as we're off topic, there does exist right and wrong, but only with respect to values. Values are entirely personal and subjective. If something supports your values, then it is right; if it opposes your values, then it is wrong. The laws of cause and effect are objective, and whether an action supports or opposes your values is a matter of objective reality (though it may not be obvious which it is).
Any claim of good or bad or right or wrong presumes a value and contains a claim about reality.
Lots of people will try to tell you (or imply) that there is only one "true" set of values. I don't believe them.
Finesse wrote:I don't think any emotion could be learned by AI. There are just too many variables that go into any given emotion. Take happiness, for example.
Katie wrote:i want some count chocula right now
Kit wrote:Westfall, you're being a dick.
Westfall wrote:Vector wrote:There does exist right and wrong, but only with respect to values. Values are entirely personal and subjective. If something supports your values, then it is right; if it opposes your values, then it is wrong. The laws of cause and effect are objective, and whether an action supports or opposes your values is a matter of objective reality (though it may not be obvious which it is)...
I agree with that. Have you read any Ayn Rand?
101998 wrote:Again, why would you want to engineer something with emotion? What would be the benefit? As great as our brains are, there are much more efficient models for mass data analysis and problem solving (which is why you need AI). You'd have to engineer the AI to be happy when it solves a problem and unhappy when it screws something up, then you have to program how it reacts in those states. We developed emotions from evolutionary incentive; AI wouldn't need that, because survival/procreation wouldn't even come into its realm of thinking. It would be like taking a career blue-collar male and putting him in a Geisha school: completely pointless.
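For what it's worth, the "happy when it solves a problem, unhappy when it screws up" setup 101998 describes boils down to a plain reward signal plus hand-coded reactions. Here is a minimal toy sketch in Python of that idea; everything in it (MoodyAgent, report_outcome, react) is hypothetical, made up for illustration, and not any real system's API:

```python
# Toy sketch of the "engineered emotion" idea from the post above:
# a scalar mood updated by task outcomes, plus hand-coded reactions.
# All names are hypothetical; this illustrates the post's point,
# it is not a claim about how any real AI system works.

class MoodyAgent:
    def __init__(self):
        self.mood = 0.0  # negative = "unhappy", positive = "happy"

    def report_outcome(self, success: bool):
        # Reward on success, penalty on failure -- the incentive
        # the post notes an AI wouldn't get from evolution for free.
        self.mood += 1.0 if success else -1.0
        self.mood = max(-5.0, min(5.0, self.mood))  # clamp the range

    def react(self) -> str:
        # The "program how it reacts in those states" step:
        # behavior is just a lookup on the mood value.
        if self.mood > 2.0:
            return "take on a harder problem"
        if self.mood < -2.0:
            return "retry with a simpler approach"
        return "continue as normal"

agent = MoodyAgent()
for ok in [True, True, False, True]:
    agent.report_outcome(ok)
print(agent.mood, agent.react())  # 2.0 continue as normal
```

Which rather makes 101998's point: the "emotion" here is just a number the designer bolts on, with no evolutionary reason to exist.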