(Yet, yes, the evanescence of text within social networks concerns me.)
Whenever I use the term 'technology' I try to use it precisely, as distinct from 'science,' as distinct from 'marketing' (which itself is, by my definition, a technology). The human species has been producing food by technological means for thousands of years: a plough is as much a technological thing as a DNA sequence artificially inserted into another organism.
The USA has travelled further down the GM track than we have, and the impression one gets is of pretty large-scale use and less public concern than here. (Perhaps this is because farming is more 'industrialised' there, and more mono-cultural; that is, mono-cultural in the ecological sense of the word.)
I agree that there has perhaps been too much of a rush to produce 'economically beneficial' spin-offs from genetics. I certainly agree that our behaviour is the problem, not as individuals but as large groups (yesterday's election result demonstrated that quite adequately). I also agree that the marketing and monetising of technology is, by and large, questionable verging on the reprehensible.
We have quite possibly passed the once-avoidable point of no return regarding the damage we have done, as a species, to this world. (Or, at least, to the propitiousness of this world for human use.) We can't choose not to factor that in and, given that the stakes are so high, we should be looking at any and all means of, at least partially, patching up what we and our consumerist civilisation have done. If we do not do this we choose to continue, through inaction, to harm the ecosystems of this world. We fail in our collective responsibilities, badly. And if we continue to drive species to extinction so inhumanely, then the only hope of undoing what we've done lies with technology derived from the science of genetics: that is, with the storage of physical DNA or the information it encodes, in the hope that our ever-developing understanding of the way in which genetic information scripts life might allow the preservation or restoration of otherwise-doomed species.
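Storing "the information DNA encodes" is, at the level of raw sequence at least, a concrete proposition: with only four bases, each base needs just two bits, so a sequence packs to a quarter of its text size. A toy sketch of the idea (the encoding and function names here are my own illustration, not any actual archive's format):

```python
# Two bits per base: A=00, C=01, G=10, T=11. Four bases fit in one byte.
BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASES = "ACGT"

def pack(seq: str) -> bytes:
    """Pack a DNA string into bytes, four bases per byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        chunk = seq[i:i + 4]
        byte = 0
        for base in chunk:
            byte = (byte << 2) | BITS[base]
        byte <<= 2 * (4 - len(chunk))  # left-align a final partial byte
        out.append(byte)
    return bytes(out)

def unpack(data: bytes, length: int) -> str:
    """Recover the original string; `length` trims padding in the last byte."""
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(BASES[(byte >> shift) & 0b11])
    return "".join(seq[:length])

print(unpack(pack("ACGTACGTT"), 9))  # → ACGTACGTT
```

Real sequence archives do considerably more (quality scores, error correction, compression), but the round trip above is the essential point: the script of life reduces to information that writing-era technology can store and retrieve.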
I see no moral argument against self-engineering when that becomes a real possibility. We have many instincts and impulses which have become counterproductive outside of a hunter/gatherer lifestyle, and it is no longer arguable that they can be left unmodified, if we wish to survive, that is. The science of the gods is not safe in the hands of individuals who share identical brain wiring with our memory-bound predecessors of a hundred thousand or so years ago: something needs to be done about these tendencies, and it needs to be done urgently. Unrestricted, they lead us inescapably to war, greed, possessiveness, imperialism, bigotry: you name it. [Needs expansion.]
I think the decoding of the syllabary of life is one step towards decoding life's language. I am also hopeful enough to imagine that, once we have decoded it, we might learn to write using it. It is this property of codification of information which intrigues me most about DNA and other self-replicators, because the underlying technology for all other technologies, writing, is the most transparent. Writing is something we invented, over a period of time, and is thus a technology (language, of course, is most likely hung on a genetic skeleton and is not an invented thing). Without writing, none of our other technologies would have been possible, as we would be entirely unable to store information and, reliably, retrieve it.
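The "syllabary" metaphor is quite literal: the cell reads DNA three letters at a time, and each three-letter codon names an amino acid. A minimal sketch of that reading process (the codon assignments are from the standard genetic code, though only a handful of the 64 codons are included; the function itself is my own illustration):

```python
# A few entries from the standard genetic code: codon -> amino acid.
CODON_TABLE = {
    "ATG": "Met",  # methionine, also the usual start codon
    "TGG": "Trp",  # tryptophan
    "GCT": "Ala", "GCC": "Ala",  # alanine (redundancy: several codons, one "phoneme")
    "TAA": "Stop", "TAG": "Stop", "TGA": "Stop",  # stop codons end the message
}

def translate(dna: str) -> list[str]:
    """Read a DNA string one codon at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")  # "?" marks codons omitted above
        if amino_acid == "Stop":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTGGGCTTAA"))  # → ['Met', 'Trp', 'Ala']
```

"Learning to write" in this script is then the inverse operation: choosing codons to spell out a desired protein, which is essentially what inserting an engineered sequence into another organism amounts to.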
Although we had probably lived without it for around 40 to 100 thousand years, any reversion to that scriptless state is not an option -- as technologies cannot be unlearned -- and, more personally, it's not a state in which I'd like to live for any length of time. There would be no possibility of thinking of the sort I am attempting here, for one thing.
I think it's highly significant that, just when our mastery of the fruits of our own invented codification systems has presented us with seemingly intractable problems, we have been able, and perhaps just in time, to discover that all life is coded by a mechanism, albeit an evolved one, which is so very similar to writing. And we have begun to learn the alphabet, the phonemes, of that script, and have just begun to be able to modify it. To cast further exploration of this serendipitous synchronicity aside now would be like torching the Library at Alexandria before the first parchment had even been stored there. And, I am sure, there were those who would have contemplated such a thing, when the first libraries were built and the first books written, just as there are those now who would contemplate erasure of what is perhaps our most significant achievement before its significance has even been understood.
We need tools to effect change in the self. We had one method, writing, and now there are two, and, surprise surprise, they're related. [--- continuation and extension ---]
Writing should not be abandoned just because Mein Kampf was a written text, or because the Bible, or a whole host of wrong-headed texts, have been written: it is not a good idea, from a mental health perspective, to lobotomise the neurologists and psychologists just because they have, historically, made mistakes which have caused a great deal of individual suffering.
Of course, we could have done something differently in the past and avoided all this. Indeed, we might have done something with the lost opportunities of the past which would have avoided even worse problems than those we presently face, who knows? But, we didn't, and that is all that matters. Despite all the technology and erudition in the world we failed, abysmally, and the time's gone. (One of my many issues with time is that its rapid passing makes us short-sighted.)
I'm worryingly close to being forty-nine now, and I've been warning about what we've been doing to the systems of the planet that we arrogantly refer to as 'ours' -- and which are somewhat akin to the systems of the body -- since I was around five years old. Every time someone responded to these warnings with the usual accusations of doomsaying and scaremongering, I took it as a personal affront; I take no pleasure in the possibility of being right on this one, and hope I'm wrong.
Long ago I decided I should never have children, but many whom I care about have decided otherwise, a decision I admire and envy, as my own is a cause of continual regret for me. I have no wish to see anyone's kids die in a desert, or hear of such deaths, either. That would be too much for me, so maybe I'm just being selfish.
I hope that makes some kind of sense. I am kind of glad my detachment from human traits is noticeable. I have become somewhat peeved with our traits of late.
I assume that there is some basis for what some call the "survival instinct," and that we hope to survive, or at least leave some form of legacy. I categorically do not think human survival is at all 'important' at an evolutionary level, except when viewed as an extinction event, and I refuse to resort to the proposition that the world requires human control: that kind of control is quite beyond us.
I'm not sure I fathom what you mean about any living thing being able to do without understanding. This seems to be the norm for the behaviour of most organisms, and for most organisms I don't see how it can be called 'foolish' at all. It's just the way things are. I see how it can be called foolish when it's us who are the actors, but 'foolish' is a human term which doesn't readily translate to organisms who do not have the kind of conceptualised, abstracted language we have developed.
Life is a system which runs constrained by strict limits, and human control is neither necessary nor 'desirable' for anyone but humans. The parameters of the Hilbert space which defines the limits of carbon-based life are realities: I fear we have pushed them too far and too fast. And, as the ones who did this, possessed of both a moral sense and at least a certain level of understanding of the biological and chemical 'whys' which other species seem, so far, to lack, I feel it is our responsibility to try to minimise the harm.
There are many different levels of distancing which can be used when evaluating things. At the most abstract, one can take the view that, given enough time, everything dies and is forgotten and nothing actually matters in the scheme of things because there is no "scheme of things." The only certainty is entropy's grey decay: everything that evolves is superseded or extinguished eventually. I can't bear staring down that abyss for any length of time, though. Partly because it hurts, and partly because I'd like to think that there may be some worth in our type of intelligence. This determination to avoid the completely nihilistic viewpoint is very human and I can't prevent it, and I don't want to. I would like to believe that 'achievements' such as music and art and literature and, yes, science have some intrinsic worth, although I know that nothing we have ever achieved has benefited any species besides our own. I would like to believe that Asimov was mistaken in his greatest fear, and that a technological society, such as the ones we have created, is not always doomed to destroy itself before it has matured enough to surpass its members' individual limitations. Asimov was smarter than I am, though, and he may well have been right.
Problem is, if that's the case, what's the point in caring at all?
We need to distance ourselves from ourselves in order that we can see ourselves more clearly: it is very difficult to sift the innate in us from the enculturated, precisely because we are so immersed and enmeshed in language, and a writing-based way of seeing.
I actually think that control of pretty much anything important, if vested in human hands, becomes a liability: we are too stupid and we are too corruptible. Those people who desire power should never be entrusted with it, and those who should have power would never want it.
I see one of our biggest problems as being the complete absence of anything against which to compare ourselves and our ways of looking. This is vastly problematic, as we cannot currently ask any other intelligent, communicative entity where we've got it wrong and where we haven't. Instead we have to rely on our assumptions, observations and imaginations, which are an integral part of being human.
If Asimov was wrong, then any extraterrestrial intelligences are either beyond our technical ability to detect, or are wisely keeping a discreet distance: this would seem quite understandable. Whatever the case, at present they are conspicuous only by their absence. Assuming this situation persists, we are left with the possibility of 'AI,' which I suspect we'll crack eventually. The problem with that is the fear that such a development could usher in the worst period of moral abuse of other sentient beings by members of our species that the world has ever seen.
Thing about wisdom is that it can usually only be identified through the lenses of hindsight and of humanity: prediction is always problematic.