Comments on The Intrepid Mind: Can robots ever feel pain?
(Comments feed, last updated 2018-01-06)

Scott Turner (2012-02-10 17:48):

From an experimental point of view, we could start by modelling Melzack and Wall's gate control theory, possibly based on the Britton and Skevington model or the Prince et al. model.

Anonymous (2011-01-03 19:26):

I guess empirical knowledge cannot get us anywhere. Neither will any amount of systematic logical argument (even though one may enter into the most esoteric of topics). We have to understand that we as human beings are very, very limited. We have imperfect senses (many other creatures on earth are equipped with better senses than ours), we have limited intelligence, and the instruments or aids we manufacture using those senses will be imperfect too. We will simply never be able to understand the real nature of the whole universe, or the purpose of our existence.

Can we, by deep speculation alone, come to understand what consciousness is and where it originates? Or must we turn to a very reliable, authentic source of knowledge, a perfect authority on these topics? We are all conscious, but what exactly is consciousness?

Suppose I and a robot are each given a book to read. What happens when a person reads a book? When a person reads, he becomes aware of various thoughts and ideas corresponding to higher-order abstract properties of the arrangement of ink on the pages.
Yet none of these abstract properties actually exists in the book itself, nor would we imagine that the book is conscious of what it records. I may find the content interesting, boring, thrilling, amusing, or horrifying; I may enjoy the reading or dislike it. Will a robot that scans every letter, word, and sentence ever be able to experience the book the way I did?

We have to find out whether there is an absolute authority who knows (or an absolute standard by which we can say) whether the colour you see (or the pain you feel when pinched hard) and the colour I see (or the pain I feel) are one and the same or not.

Is there such an authority?

Mayurvg

Anonymous (2010-10-13 15:09):

I liked the post, but here the critics would say that this argument is circular.

Smig (2010-10-13 00:56):

If (and it's a big if) we start from the assumption that such a thing is possible, I don't think we'd ever know whether the robot was feeling pain, or what it was like for the robot to feel pain.
Much in the same way, we don't know what it is like for a fly to feel pain.

Anonymous (2010-10-09 21:00):

Tell me, what would it be like for a robot to feel pain?

Smig (2009-11-17 00:18):

But my point was not dependent on an objective, absolute standard of pain. It suffices to assume that humans have a feeling of pain that resides in the subjective realm of the mind, which I find impossible to reduce to some objective physical framework of particles or waves.

This connects with the part where you said, "So if a robot has the sensory apparatus to observe and translate a signal, who are we to say that such an interpretation is incorrect?" I think this is the central part of my post, and I think you're ignoring the existence of the sensation and concentrating on the behavior alone. How would you define "observing and translating a signal"? My calculator does just that, and yet I don't think it can experience any kind of sensation; it's only behavior we're observing. I was trying to show that difference with my robot: simply interpreting a signal from a mechanical pressure sensor and behaving in response to it doesn't entail the existence of the sensation that we experience, just as a camera that registers the wavelength of what we'd call a red photon doesn't experience "redness".

Smig (2009-11-17 00:17):

Hi Eric, thank you for your comment.
You are right in pointing out my assumption, but I think you're misinterpreting it. I am indeed assuming that humans really feel pain; it's an extrapolation that can't be verified in anyone but myself, and by myself, but I think I'm reasonable in making it. You seem to concede that towards the end, even in the case of other animals, and you raise a good point in the process: should we assume that a robot is feeling pain if it behaves as if it's feeling pain, much in the same way that we presume a bull in a bullfight is feeling pain because it behaves as if it is?

I think that if we accept the impossibility of constructing a feeling on objective foundations, as is apparently the case with the robot in my post, then the analogy loses its force as a way of proving that such a robot would actually be experiencing pain. However, a different problem could arise in reverse: why do we assume that the bull feels pain if we're assuming that the robot couldn't? And even, why assume that other humans feel pain? But the animal's behavior is not the only basis for our assumption. My reasoning for thinking that the bull actually experiences pain rests on the fact that I feel pain, coupled with my beliefs about what the bull and I share in our origins, which give me no reason to think that I'm special. I'm then left with no good reason to think that other animals don't experience pain, given that they behave as if they did. This doesn't hold for the robot, as the nature of its existence is of a different kind.

Can these apparently incoherent beliefs be consistent with one another?
Intuitively, it seems unlikely, but if we have good reasons to make both assumptions, then they must be consistent.

Now, I think you were mainly disagreeing with the idea that pain is the same feeling for everyone, that pain feels the same way to everyone; about that, I haven't argued one way or the other. It's still a very interesting discussion, though. Is the masochist feeling something different that is not pain, or is he merely finding pleasure in the feeling of pain? Is the difference between your feeling of pain in a hot shower and your wife's a qualitative difference, or merely a quantitative one (desensitization)?

One problem with that discussion is that objectifying something that is subjective in its nature doesn't seem possible, and it might be the wrong approach when discussing something of that nature. I mean, there's simply no way to know whether we're both seeing the same color when we're talking about red. If I saw the color spectrum reversed (shorter wavelengths for reds and longer wavelengths for violets), we'd never know that we were seeing the world in different colors, as long as we could point to an object and agree on the name of its color. I would still stop at a red traffic light, even though my red was actually your violet. It wouldn't matter. In fact, there might not even be any correct sensation of redness; it could be arbitrary as long as it remained coherent across the spectrum.

Anonymous (2009-11-06 23:28):

Hello,
I loved this post, but I did find one small flaw in your logic:

You're assuming that humans really feel pain.

At first you may say that assuming otherwise is nonsense, but consider the masochist: a person who perceives pain signals as pleasurable. Some consider this a neurological disorder, but it is more likely that such behaviour is learned. After all, the same nerves represent both pain and pleasure, but the transition point between the two sensations varies from person to person. For example, I enjoy showers at temperatures my wife would consider scalding, but when I was younger I couldn't endure such temperatures, so the enjoyment is learned; the sensation is still the same (apart from a small loss in sensitivity), but above all it is the reaction to the sensation that has changed.

Therefore, pain may be nothing more than learned behaviour. It is my personal experience that the rougher the parents, the less sensitive the child is to pain. I had a Muay Thai trainer who always used to say, "Pain is just a signal," and in my experience this is true: we can train ourselves to overcome such sensations (some more than others, depending on how ingrained the notion of pain is in the individual).

So if a robot has the sensory apparatus to observe and translate a signal, who are we to say that such an interpretation is incorrect? It's worth noting that this is exactly how scientists used to justify animal cruelty, by saying that the animal only emulates an emotional response but cannot actually feel it (a notion that stemmed from the idea that humans were special creations of God, and therefore the only beings capable of true feelings).

You said it yourself: "what we feel is in the subjective realm of our mind."
By this very definition, if a robot says it can feel pain, and if we have no evidence to suggest otherwise, then we must assume (at least until a deeper understanding can be reached) that it is telling the truth. Anything less would be... inhuman.

Eric Patton, Woodland, California
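[Editor's note] Scott Turner's comment at the top of the thread suggests starting from a computational model of Melzack and Wall's gate control theory. As a purely illustrative sketch (this is not the Britton and Skevington model, just a hypothetical static toy in which large-fibre touch input "closes the gate" on small-fibre nociceptive input), the core idea might look like:

```python
# Toy sketch of the gate control idea: both fibre types excite the
# transmission (T) cell, but large-fibre activity also drives an
# inhibitory interneuron that suppresses transmission. All weights
# here are made-up illustrative values, not fitted parameters.

def gate_output(small_fibre: float, large_fibre: float,
                gate_gain: float = 2.0) -> float:
    """Firing rate of the T cell, clipped at zero."""
    excitation = small_fibre + large_fibre
    inhibition = gate_gain * large_fibre  # the "gate" closing
    return max(0.0, excitation - inhibition)

# Rubbing a painful spot (adding large-fibre input) reduces the output:
print(gate_output(small_fibre=1.0, large_fibre=0.0))  # gate open
print(gate_output(small_fibre=1.0, large_fibre=0.8))  # gate partly closed
```

On this toy account, "pain" is just the T-cell output crossing some threshold, which is exactly the behavioural reading of pain that the rest of the thread is arguing about.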