06 September 2013 @ 11:35 am
Any and all upgrade kits left on my doorstep will be accepted and implemented, no questions asked  
I find that I have put sentience into a special, sacred realm, where I consider it to be inherently valuable. This value is intimately linked in my mind to something like autonomy, and the value I place on these two features seems to support certain of my transhumanist desires, but also makes me a little bit picky about which possible scenarios involving greater-than-human intelligences I prefer or find abhorrent.

Note: For the purposes of this post, let's not argue over what I define as conscious or sentient. I'm not sure this is all that well thought out; it's just a reaction I've noticed in myself and am still exploring.

I'm not sure whether treating sentience as inherently special is acceptable under the rubric that one must draw the line somewhere, or whether it's just as arbitrary and dangerous as any other realm held sacred. Once something is conscious, I grant it some level of inherent and inalienable value, and a right to be autonomous and respected as worthwhile in and of itself. I don't expect human beings, for example, to earn their keep - ideally, everyone "deserves" a satisfying life with their needs met; eudaimonia and physical comfort don't need to be "earned". I recognize this is not necessarily practically achievable, and I have not worked out a response to that fact, but let's set it aside for now; all it means is that I am uncomfortable with many sad facts about the way the world is. And I balk at devaluing the life of another conscious being just because I find that life unpalatable: I'm generally willing to give them credit for wanting to be alive, even if I think it would be better not to exist than to live the way they do. With currently existing sentiences - i.e. humans - I want the best for them: for them to continue to flourish, to attain autonomy, to avoid oppression. A big part of what seems most ethically acceptable to me is for an intelligent being to have some measure of control over its own existence, including its own cognition.

However, the current state of affairs generally does not live up to my standards, and so I recognize the inherent tragedy of human existence. I guess I really do believe that our limitations are a tragedy.

A book I read recently ended with a coda, after most of mankind had died away, in which one of the female characters was happily and proudly pregnant, rebuilding the human race. This sort of post-apocalyptic scenario, where it's simply a given that women must devote themselves to childbearing, purely due to facts of biology - especially when it's painted as something they are happy or proud to do - stresses me out. Maybe part of it is that so many aspects of that attitude are not, actually, science fiction. Childbearing may not be painted as vital for the survival of the only intelligent race we know of in most cultures, but the pressure is a strong force; and for people who feel there are 'right' and 'wrong' ways to be, or anything along those lines, there is an argument that the 'right' people should be having babies - i.e. that smart, educated, non-religious women need to breed to outcompete the poor, ignorant, and religious, who are flooding the planet with their large families. Whether or not any of this has validity now or ever did, the possibility of a situation - say the post-apocalyptic one - where it would be valid, or where social necessity is such that it may as well be valid as far as I, as a woman, am concerned, alarms me. I imagine myself in that place and think: I don't want it. Yes, we all have to live in ways we don't want to, certainly in such a hypothetical extreme, but I still find it disturbing. One could object that this worry is one-sided: maybe a man would say, 'No, of course I wouldn't be okay with it suddenly becoming imperative that I devote my life and body to making children, and I certainly wouldn't take pride in that sort of "contribution" - but hey, that's different, that's just the way things are.' Of course men have their own unavoidable burdens and expectations, but I wouldn't want them to be stuck with those, either.

This whole act of accepting how things just fundamentally are or must be, accepting the destiny of our biology, sometimes disgusts or even frightens me. Even less extreme facts about gender can feel that way to me. This scenario in particular, though the feeling probably adapts to others, makes it seem urgent to close the gap between ourselves and our technology as much, and as soon, as possible. I don't want it to be possible to lose the advances we've already made merely because something has gone wrong. I want the power to do as I please to be an inherent part of me, one that will not disappear with the collapse of a complex and tenuous infrastructure, with anything going even moderately out of its intricate order. I don't want the knowledge and achievements of my civilization to remain external, separable, losable. I want them to be part of me; I want my very bodily existence to contain the tools and information I require. My biology is inadequate and damning, and it is important that it change. I am deeply unsatisfied with common visions of the future that postulate essentially anatomically human beings with more and more advanced devices. What are people thinking, assuming we will be so much like ourselves, just with accessories? I don't want accessories; I want, myself, to be different, to be advanced.

People speak of humans in the future and worry about humans' relationships with machines. There is a possible dystopia in human-machine relations, and what I want is to eliminate the divide. I see all the problems that come with unequal beings: subservience, oppression, the corralling of sentient beings by others. I don't know that I argue for one type of creature holding a monopoly on sentience, as one currently does, but significant gaps in power or ability disturb me. This pertains to my discomfort with power and control, which itself stems from my assumption that accurate predictions cannot be made about complicated systems, particularly those involving intelligent beings. Maybe these beliefs will change as my notion of what is possible changes; as it stands, I doubt that any entity will be sufficiently intelligent to make predictions accurate and reliable enough to be trusted to control people or their situations safely. I believe this is the case for all humans currently, and probably all currently existing systems, at least for large enough contexts. Humans and computers can reasonably be trusted to make controlling decisions in small enough contexts (for example, single-instance medical cases for humans and/or computers, or flight control for computers), but on larger scales, such as whole societies or governments, I don't think anything has that power and knowledge right now.

If we could suppose a wise enough AI, however, one that could engineer society for the benefit of humans (or other intelligent but less-intelligent-than-the-AI entities), a new nervousness seeps in, related to my fundamental valuing of the autonomy of sentient beings. Something about being intelligent and self-aware, yet acknowledging in all areas the superior choices of another, creeps me out. If we are not smart enough even to hope to verify to our satisfaction the veracity of the wise entity's proclamations, it seems to put us in a position of complete dependence, reliance, and subservience that jars with my notion of sentient autonomy. I can accept deferring to experts in general, but once that deferral becomes so universal that there is no method of investigating the claims or reliability of these experts, it becomes disturbing to my sensibilities. I may not be able to understand the full extent of a human expert's claims, but we are close enough in knowledge and intelligence, on the whole possible spectrum, that were I to take some time and effort, I am at least capable of noticing if something seems fishy, or whether the claim corresponds with other information I understand better, or of otherwise finding reasonable clues as to the reliability of the claims. Their claims are better than any I could make, but not so grandiosely better that it is always in my best interest to accept them without further thought. Once the expert's knowledge and intelligence are so far out of my own range, however, I become helpless. It basically never makes sense for me to do anything but believe unconditionally. Which is fine, given that such an expert will of course be right so much more reliably and thoroughly than any current human expert, but what does that do for my existence as a supposedly intelligent being?

It may be that I have not thought this through thoroughly enough; this sort of deference already occurs in many, many areas, and I don't find it repulsive. It may be that the idea becomes offensive only once there is absolutely no area of my life whatsoever in which I need to exercise my own intelligence. But what if there were just one small, limited area? That doesn't seem so great either. I'm not sure how to quantify what I find acceptable, or why exactly. It certainly seems important to have access to correct information, to wise and reliable advice. But to put an intelligent or sentient being in the position of never being able to form its own opinion or evaluate its own data, ever, seems somehow unethical to me, if only because of my intuitive discomfort with the thought, if only because I have placed this special importance on the autonomy of sentient beings.

So I am reluctant to support the further creation of beings as limited as we are. I value many things about human existence; some of these, I suspect, may just be rationalization, making do with what I have, while others are more universal. The latter seem to be the ones I want to support, maintain, and create; the former I suppose I do love currently, but am willing to see go extinct. And so I do not find the notion of humans, as current humans, disappearing and being replaced by some other, superior race disturbing; in fact I find the notion of us continuing as we currently are, either alone or coexisting with superior beings, quite repugnant.