Introspection: The heart of the matter

Cogito, ergo sum (I am thinking, therefore I exist). – René Descartes

All this (whatever THIS is) began as a pet project. That is, what you find here is the result of wanting to create a local database that could be used for quick referencing – like using a highlighter to (mentally) mark up hard-copy texts while reading.

Keep in mind that I began this as an after-work hobby, when online search engines were not as pervasive as they are today with the likes of Google and Bing (among others), nor was any single source as authoritative as Wikipedia has become (again, among many others). I’m speaking of a period of internet time when sites like Site Crawler, Ask Jeeves and Netscape constituted a part of the evolving digital footprint that online search now encompasses.

My objective has been far more modest in conception. I sought to provide (local) access to reference sources via my personal network. In the process, as with the Internet itself, it has grown and taken on a life of its own.

[An aside – my way of saying that I’m about to rant]

It annoys me: listening to the general public (newbies) refer to the World Wide Web as The Internet. I’m at the point where I now tolerate the misuse of the two. I blame it on the marketing crowd and the public’s apparent desire to be fed bits and bytes (morsels) of information. Why? The entire ‘buffet’ is right there at their fingertips. Explore with a purpose.

Consider the headline-seeking crowds that populate ‘feeds’ like Twitter or Facebook – remember AOL and its walled portal? How are they, or any other social networks, different?

Instead of a 14.4 kbps dial-up modem on an Intel 486 or a Pentium Pro connecting to a Bulletin Board Service, we now have always-connected, high-speed fibre connections that can link you with another person (computer) around the world in nanoseconds, so that you can Zoom (and/or bore) them about your rather ordinary day.

Just think – all that computing power in their hands, and the general public is so blasé about the devices. Kids with toys. Where’s the MTV crowd today?

I remember the ‘good old days’ when applications/programmes were invoked via icons situated within the GUI. Icons that, essentially, hid the operating system commands that scripted the execution of applications via the click of a mouse-pointer. DOS or iOS command syntax is not for the casual computer user, as any computer geek will tell you.

As you may surmise, I’m on the techie side of things. I often see irony and appreciate ironic humour. The sitcom The Big Bang Theory thrived on the believability of its characters. Across the board.

[Rant Off]

The maintenance of my pet project has evolved to the point where a considerable amount of time is required for its upkeep – especially since the contents were not originally intended to be dynamic in structure. Then there’s the rest of it – the security, backups, updates.

Then BAM! It’s a web, dummy! (Hyperlinks are the keys to the World Wide Web.)

My once side-project has become a full-time endeavour; it helps that I’m now ‘retired’.

Wait! That makes me one of those ‘ole Baby Boomers’ that some among us think ought to be put out to pasture or set adrift on ice floes – like a certain recent ‘retiree’ should have been. Bite me!

I like my externally attached keyboard and mouse (a precision that touchscreens don’t provide), my 32-inch flat-screen monitor, and all the computing power (GHz) and storage (terabytes) that my custom-built computing workhorses provide. Laptops and other mobiles with their tiny screens don’t compare (seriously, HD on a 7–10″ screen?). There’s a reason why computer visual displays have evolved into what we have today.

Apollo Moon Landing July 1969

Squished keyboards with ‘touch-pads’ don’t move me. Mobile devices are for convenience. Nothing more. Even though they’re capable of massive digital storage and (near-instantaneous) communication with anyone who has access to a corresponding ‘receiving device’ in (almost) any part of the world.

We’re not talking about Dick Tracy’s wrist-watch radio or Get Smart shoe-phone gadgets here. Selfies, anyone?

If you feel the need to attach an external keyboard, mouse or monitor, then you might as well go all the way and get yourself a big-boy machine; that is my position.

Maintaining all the devices that connect locally through the single point of access to the Internet [the gateway] – anti-virus, firewall configurations, user access, hackers, etc., etc. – has made me a Home Network Administrator. Sans expectation of fee, of course. It’s what I do in my free time – maintain the utility.

Think about that the next time you ask your buddy or neighbour to ‘help you out’ with something in their field of work, interest or specialty – the buddy who loves fishing, automobiles or electronics, for example. Geeks ‘R Us. A geek’s work is never done. Geek, you say? A badge of honour!

For some, modern computing is on par with public utilities. For others, it’s a passion.

Moving on ….

Reality TV

I recently watched a reality-TV show, Black and White, whose apparent thesis is that ‘if you walk a mile in my shoes, you’ll learn to appreciate a particular cultural perspective or world view’. Interesting, but it glossed over the reality that an observer of an event, scientific or not, in the final analysis always makes a subjective report of the matter under study. As such, the observer’s account is always open to interpretation by second- or third-party observers. In much the same way, the existence or non-existence of God fragments into a variety of debates that amount to POVs informed by the human-experience factor.

That is, any scientific, dispassionate or clinical (objective) attempt to give validity to underlying assumptions or suppositions will fail to convince those who don’t share a similar POV – since their assumptions are subjective. How can this not be so, when the objective of the exercise – proving a notion – is clearly subjective to any observer? The question as to whether God exists or not becomes irrelevant to any enquiry, since the mere expression of the existence of such a being validates that God does, in fact, exist.

The question, really, should be – does God exist outside the Realm of the Believers?

As an observer of the debate over the question of God, in this particular case, I’m agnostic, so I attempt to maintain a veneer of objectivity informed by the fence-sitting. The agnostic (observer) does have a vested interest in the outcome of the question being debated; and to date the score (proof) reads Believers 1, disbelievers (atheists) 0. The sceptical meter may waver (every once in a while) in either direction, but no definitive result has been acquired to date.

No shock, awe or wow of any significance has been forthcoming from either side of the philosophical fence, so far.

Decidedly Older (not necessarily wiser)

A Life Lived Unexamined Is Not One Worth Having Experienced; And Other Such Thoughts….

If wisdom comes with age, then clearly the inference is that wisdom comes in measurable degrees and is quantifiable – just as aging is. Age, after all, is but one unit of Time. But just how is wisdom or knowledge measured?

The question brings to mind the notion that Men have lost sight of the need for, or utility of, technology as they’ve become immersed in the minutiae of science and its purpose. That is, it’s useful to be able to refer to Time as a measurable unit, in a general sense. But knowing the scientific underpinnings of Man’s notion of Time borders on the ludicrous as it relates to day-to-day activities and events. If one’s attempting to locate a distant speck of cosmic dust quad-trillions of light-years away, such knowledge is useful or necessary.

But closer to home, finding or hitting a target with any degree of accuracy depends on whether one is using a BB gun or a ballistic missile. Do you need a BB gun to bring down an elephant, or a shotgun to kill a flea? All answers can become subjective. But if logic suggests alternatives to achieve greater efficiencies or results, why is it so often necessary to point out the ‘obvious’ through the use of structured logical reasoning? Does this suggest that the Obvious is not necessarily so, thus making it a logical paradox? That is, you cannot learn that which you already know. So, if something is known or obvious, it must be so for a particular reason that’s available to general and/or common knowledge. If it is not, then the implication is that specialised knowledge is required to make the knowledge obvious. Thus, that which is generally unknown remains generally unknown – not necessarily unknowable.

All human means of communication posit that there exists a reason for the activity or exercise. That is, it serves a purpose. However, there appears to be no definitive answer as to what that purpose is. Or, it can be asserted that many purposes can be attributed to any communication, since answers become dependent upon context: subjective or objective. Historical records are never objective – history’s written by the victors? – since the recounting takes on a hue of subjectivity from the recorder’s POV. There’s fact, fiction, politics and propaganda to contend with in any retelling of past events, which is why eyewitnesses often offer differing views of the same event.

Being of modest means, tastes and aspirations, autobiographical accounts that detail every (almost all) aspect of one’s life seem to me somewhat like vainglory, conceit, self-importance, self-aggrandisement.

That is, apart from one’s particular POV on any aspect of that life, the question of importance or relevance in the Grand Scheme of Life begs the question: what makes you more particularly special than anyone else … in the microcosm that’s life?

As I said, modest. Biographies, auto or otherwise, amount to self-portraits that place the focus on the integrity of the observer or recorder of past events.

The Art of Relevance

“Life is not a problem to solve, it is a reality to experience.”

Autobiography. The essential definition of pretentiousness, egoism, arrogance, boastfulness … the self-preening of an egotistical arse.

As I sit here and begin this ‘opus’ of mine – hopefully appropriately titled – I’m thinking about Peter Gabriel‘s song In Your Eyes. What does that say about my social anchor? Where do social references, the compass of behaviour, originate?

The reality is that I’m actually listening to one of my favourite sets of musicians – U2.

Dependent upon either the sacred or the profane, nothing that is asserted about the history of Man can ever be complete without answering one question: so, where or when did it all begin?

This is nothing new or radical of thought. Greater historical minds than mine have pondered that question. The lore-keepers and recorders of Time are extensive, generally well-known and well documented. The Masters have enquired and responded, each in their time. As a student of their works, all I have to add is mere commentary in the annals of Time’s passage.

Feel free to apply labels as needed. If the eyes are, indeed, windows to the soul, then one needs to be prepared for a maelstrom. Man is simple; Men are not. History tells us so.

I’ve often thought that auto-biographies belong to the self-delusional, self-important, pompous and self-aggrandising sort among us. Biographies are far more interesting. And, yet, here I am.

So, choose your weapon and stance carefully. Who is Man (or Who Am I?) is not open to question.

According to one’s choice of Source of Authority (insert your preferred or favourite), it is a question that raises a host of further questions about such things as Truth, Validity, Veracity, Bias, Independence and so forth. In the end, what it really amounts to is navel-gazing introspection.

Show me another creature on planet Earth that gazes upon its world, then chooses to create a world of its own. And not only that: ‘within that reality’ it then spends the rest of its existence questioning which of the ‘real or imagined’ worlds is, in fact, Reality?

I can, for example, imagine that the last thing that goes through a bug’s head – as it smashes into a windshield at highway speed – is its arse. The bug’s as real as the windshield. Does the ‘grand designer’ notice or care that the bug ‘dies’ (among millions of other ‘bugs’) in such a constant manner?

There are those who will want to interject about determinism, fatalism, fate, destiny …. ad nauseam. But in any imagined ‘verse’ the chronology of creation or ‘coming into being’ is dependent upon one constant – Time, the equaliser.

And, if Time is the determinant of (past and present) events as related or recorded by Lore-Keepers (Historians) then, clearly, auto-biographers have lost sight of the purpose of Lore-Keeping: it’s not about the author. Rather, it should always be about the ‘world around you’. Perception is reality; and that requires documented evidence, unadorned, clinical and as unbiased as is possible.

I can also imagine, and believe in, both the scientific and the magical; distinguish between Fantasy and Reality. If the world around me is an illusory creation of my own, what does that then make me? Master of my domain? The Big Bang and Creationism are but skins on the onion that we perceive as reality, Life. Rationalism, for the most part, tends to prevail when confronted with sceptical enquiries.

That, in itself, is not problematic. It’s when reductive scepticism is applied to all things and in all areas of life that it becomes so. The substitution of ‘feelings’ in place of ‘deductive reasoning‘ has led us, as a species (remember, the rational ones?), to some intense and collective navel-gazing.

Accepting that biographies contain kernels that may, potentially, become notable reference sources at either the macro or micro levels of human societies, their importance at a societal level is less clear (outside of their cultural contexts). The sacred and the profane are both dependent upon the dispensation of some external actor or factor.

This becomes increasingly clear in the Information Age of the 21st-century world, where global information generation and collection have given wings to data-mining – the engine that has driven most human activity since the Industrial Revolution of yore. It has been kicked into high gear as humans discover or develop newer means of generating, gathering, processing, distributing (and monetising) the aggregated bits and bytes of human activities.

This represents not a social revolution but rather an evolution.

In the beginning was the Word, and the Word was with God, and the Word was God. (John 1:1) And the Word became flesh, and dwelt among us, and we saw His glory, glory as of the only begotten from the Father, full of grace and truth. (John 1:14)

And, in other news, scientists – quantum physicists – continue to seek The Answer to the questions that surround the origin or initiation of The Big Bang. Who or what struck the first spark? If all things are indeed relative (as Einstein has mathematically shown), then once the initiator is discovered, all else should become crystal clear.

Before it, only darkness, null, a Void existed – no god. So, from whence did the first spark of light emerge? From what dark hole of nothingness did being and time originate? Is Dark Matter the source of Life?

Does it matter, one iota, in the Grand Scheme (Life) whether we are self-aware, sentient and individual entities? Does any of it mean anything more than a cosmic mote in the eye of the Creator/Initiator?

Reality is not itself a problem – it is what it is.

Problems arise when men attempt to define what reality is to All; and in so doing humans end up constructing alternative realities that collide with what can be called The Collective Reality. If you can’t always get what you want, then why not try and try again?

It says right there in the scriptures: … God created Man in his own image and likeness (Genesis 1:27); and I believe in both magic and science – both ‘godly’ sources of power.

Darwin observed, documented and then concluded that life on Earth evolves, and that the fittest survive the process. And, in the grand scheme of Life, Man has found himself sitting atop the Tree of Life. The food chain, if you like. God apparently gave some of his creations (by most accounts) dominion over the rest of his Creation. The question left to be answered is: what makes us so ‘special’ in the eyes of the creator?

Protagoras (he surely was not the first, nor will he be the last) pointed out that ‘all is relative’; the veracity of which has been demonstrated by Einstein’s Theory of Relativity [E = mc²]. Even if you believe in miracles, its sublimeness cannot be denied. Who are the Chosen Ones?

Narratives and Chronicles

Space – The Final Frontier

It was not until my introduction to university-level Humanities courses that I discovered a joy for History and Literature: Man, The Social Animal.

The ready availability of the World Wide Web has made it easy to pursue in-depth analysis of events and issues around the globe. The answers to questions were all around. As I recall, I did not particularly enjoy earlier history classes – especially the rote learning of names, dates and places with the objective of getting a passing grade in the subject. All the intriguing and colourful stuff was mostly glossed over, and without much context.

The Humanities provide the keys to understanding the Social Animal.

I’ve long been an “information junkie”. Throughout my life, I’ve devoured the contents of books and newspapers at every opportunity. I used to haunt public libraries. While reading novels (not all), I’d highlight passages or make notes in the margins as certain ideas or phrases grabbed my interest.

Watching televised news is something I find generally dissatisfying. The shallowness and brevity of ‘news item’ coverage often seems like ‘small-talk news’ – the type that suits those who populate the twitterverse. Nonetheless, I’m also a politics junkie. Not in the partisan way of what now passes for public or political discourse; my interests are more aligned with the social, philosophical and political aspects of the discourses themselves. The science of it all.

The everyday application of political ideals has its place in the social (political) aspects of our lives – entertainment, for example. But I’m interested in the Machiavellis, the Mark Antonys, the Macbeths and the Medicis of 21st-century societies.

It matters not whether you self-label or identify as being a Liberal or Conservative, Democrat or Republican, Marxist, Socialist or Communist. Whether Democratic, Autocratic or Theocratic, individuals who seek and wield political power (even in the case of religions) are all guided by baser instincts and self-interests.

Jean-Jacques Rousseau (The Social Contract) addressed the legitimacy of the Authority of the State over the Individual in Civil Society. He posited ‘The Social Contract’ as the basis upon which all individual self-interests are subsumed to those of the State, encapsulated in the Social Compact between individuals. In current terms, Individual Rights and Freedoms are enshrined in State Constitutions. The Law is what The State deems as being Legal or a Civil Right.

There is no Opt-In choice. Opting out is certainly available to any individual who chooses to exercise their ‘Natural’ right to leave the constitutional confines of the civil group, the State. That is, self-exile. And, as has been clearly articulated by Thomas Hobbes (Leviathan), such individuals can expect that life outside the social construct of Civil Society would, at best, likely consist of “continual fear and danger of violent death; and the life of man, solitary, poor, nasty, brutish and short.”

In short, theoretically, the Individual does not give up his ‘Natural Rights’ as a condition of being an equal member of the collective – The State. He or she merely assents to allow their personal self-interests to be governed by those of the Collective or The State.

John Locke, in considering the degree or extent to which an Individual in Civil Society can expect to freely exercise his ‘natural rights’ within the strictures that will be imposed by the State’s Law or Constitution, addresses the question of Civil Liberties.

Liberalism, or the exercise of civil liberties, as John Stuart Mill pointed out, comes with attached costs. John Locke also had much to say about liberalism. That is not to say that others have not opined or put forth their thoughts and observations on the human condition in what we (should) all accept as the social conditions of modern or civil societies.

The Greek historians and philosophers have left their indelible marks on Western societies, just as the Marxists, the Leninists, Stalinists, Maoists or Islamists have done in their times. Yet, in spite of the literature and bodies of work that exist on the nature of politics, good and bad, and the sciences utilised to analyse and evaluate the progress of human history (admittedly, there’s no other type of history!) or evolution, it’s fair to say that today’s public discourses have become mired in navel-gazing subjects such as identity politics.

Paper Clippings (In The Digital Age)

Man – The Naked Ape

Before it became commonly possible to create and maintain digital records in one’s home (thanks to the advent of the Personal Computer), my passion for information and reference sources led me to clip out articles and collect books and magazines – along with making notes about various subjects of interest.

Without quite realising just how much of a ‘geek’ I was at the time (and before The Big Bang Theory series made being ‘geekish’ cool), I was seamlessly drawn into the technological revolution that the personal computer portended. After all is said and done, I’m part of the generation that feasted on the tales and travails of Buck Rogers in the 25th Century; Arthur C. Clarke’s 2001: A Space Odyssey and Expedition to Earth; Isaac Asimov’s Foundation and I, Robot; Robert A. Heinlein’s Stranger in a Strange Land and the Life and Times of Lazarus Long; Ray Bradbury’s Fahrenheit 451, The Martian Chronicles and The Illustrated Man; Philip K. Dick’s Do Androids Dream of Electric Sheep? [Blade Runner]. Let’s not forget Jules Verne’s Twenty Thousand Leagues Under the Sea or Journey to the Center of the Earth. Gene Roddenberry’s Star Trek and George Lucas’s Star Wars. Aliens, or Steven Spielberg’s Close Encounters of the Third Kind and E.T.

Neil Armstrong and Buzz Aldrin, or NASA’s Voyager programme.

By the time the Internet and the World Wide Web appeared, I had been pining (and still do) for a flying car – a personal ‘transporter’ to zip around the world (see The Jetsons).

All of which is to say that I’ve embraced the digital shift toward data collection, collation, correlation and dissemination, as it can lead to the ‘centralisation of knowledge’ – the Big Brain of current human civilisation – the power of which can be staggeringly impactful on future human societies. Science-fiction-ish? Yes. Even George Orwell was that prescient in his writings.

And so, along the way I’ve come to the realisation that I have a predilection for cataloguing. The result of which is this current and personal ‘web site’ project.

Being a ‘computer geek’ has provided me with the means and ease with which I’ve been able to turn a little side project into an ongoing ‘labour of love’. There is no commercial intent involved. Just: subject-matter title, search, document and save. The cataloguing or indexing / cross-indexing of documents and subjects is solely subjective on my part.
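
For the curious, that workflow is simple enough to sketch. Below is a minimal, hypothetical illustration (in Python) of the kind of subject cross-indexing just described – the names and schema are invented for the example, not this site’s actual machinery.

# A minimal sketch of a cross-indexed document catalogue.
# Everything here is illustrative; the real thing differs in the details.
from collections import defaultdict

class Catalogue:
    def __init__(self):
        self.docs = {}                 # doc_id -> metadata
        self.index = defaultdict(set)  # subject -> doc_ids filed under it

    def save(self, doc_id, title, subjects, source=None):
        # "Document and save": record the metadata, then cross-index by subject.
        self.docs[doc_id] = {"title": title, "subjects": subjects, "source": source}
        for subject in subjects:
            self.index[subject.lower()].add(doc_id)

    def search(self, subject):
        # Quick referencing: every document filed under a given subject.
        return [self.docs[d] for d in self.index.get(subject.lower(), ())]

cat = Catalogue()
cat.save("d001", "The Social Contract", ["Rousseau", "political philosophy"])
cat.save("d002", "Leviathan", ["Hobbes", "political philosophy"])
print(cat.search("political philosophy"))  # both entries, via the cross-index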

I use Wikipedia, primarily, as an external source of topic/subject references (assuming a certain degree of neutrality in content versus references to ‘partisan’ specific web sites).

In those instances where non-Wikipedia sites/sources are cited, the intent is to indicate that the resultant information has a greater degree of ‘partisan’ bias attached. Or, to state the obvious. The intent is not to be seen as promoting (or minimising) any particular point of view.

I admit that some of the views (or positions on any given subject) may become discernible with analysis; but my intent is never to become the ‘voice’ of others. Toot your own horns, I say. Stake your claim and be ready to defend it, IMO.

I think that using ‘library’ to describe this site’s contents is a tad pretentious. Call it my digital scrapbook of this and that, accumulated over the years. And, perhaps, of little to no interest to others, outside of a few ‘hardy’ souls among the billions that share this planet.

Librarian of Palanthas, Lorekeeper

Astinus Librarian

Several years ago, waiting around at the airport for a flight to a beach-resort vacation (and browsing for a novel to take on the flight), I stumbled across the sci-fantasy genre of Dungeons and Dragons. A big fan of science fiction at the time, I also enjoyed an indulgence in fantasy-wizardry novels such as The Sword of Shannara. A fair amount of time has since been spent keeping up with the tales and adventures in that particular ‘verse.

The Dragonlance universe (as envisioned by the writing team of Laura and Tracy Hickman) then became a similar self-indulgence.

Like Tasslehoff Burrfoot, I tripped into the realm of dragon lore; and (as someone who had grown up with magical lores that included elves, wizards, warlocks, witches, faeries, gnomes, goblins, Merlin, etc.) it felt like a familiar blanket, a kindred spirit. Not the watered-down, sanitised pap that Disney has been spoon-feeding the public for several generations.

I started my Dragonlance adventures smack-dab someplace in the middle of the chronicles – Dragons of Summer Flame. By the time I came to the realisation that I needed to start at the beginning of the Dragonlance ‘Histories’, I was far too deep into the lore to stop myself (plus, being at a beach resort with nothing more compelling to do) from being enchanted by the tales of the lorekeeper.

A testament is not the same as The Testament
Not being a religious person, mysticism/spiritualism intrigues me at a certain level. Sprinkle in some scepticism and cynicism, and I’m likely to say that I am a rationalist (with a tendency to have ‘irrational thoughts and ideas’ on existential or metaphysical matters). Or, if you prefer: the Yin and Yang, the Warp and Woof of Life are all intriguing. Who Am I is as clear to me as What Am I and When Am I. Questions that beguile and drive us all.

Why?

It’s, perhaps, the fundamental question that drives us as humans to profess a belief in an ever-present, omnipotent Supreme Being (GOD) behind the veil of life. It’s similar to that which drives others to believe in tales of magic and mythical faeries, dragons, gnomes, dwarfs, elves, wizards and warlocks. All seek to establish a basis for certainty; to explain away the present, whether perceived as Order or Chaos.

Which is not altogether different from those who profess a belief in an ever-present Super-Evil Being (Devil) who works in tandem with God to maintain the balance of the fulcrum that we call Life. Reality.

I’m familiar with the type of individuals who, in their mind or world-view, will fall to their knees in paroxysms of unintelligible babble at the tale of Moses heading into the hills, alone and unaccompanied, and his account of his encounter with the burning bushes while up there.

Oh! And while he was there, he did manage to find time to think and to put into writing – in stone – ‘The Ten Principles To Having A Good Life’, famously known in some circles as the Ten Commandments. (I must add, Cecil B. DeMille and cast did a masterful interpretation of the events of that particular lore; I recommend this version.)

And, right there in the scriptures, the Devil is also an Angel. A fallen angel, it is stated, who walks among the believers. Belief in one necessitates the other.

And what about that Divine Tree of Life, and all the lesser deities/angels/demons, spirits and other minor players that spring to mind with such regularity? A cornucopia of Human imaginings.

To each their own. Fantasies or Beliefs. Heaven, Hell, Valhalla, Zeus – all imaginary human inventions. All of which amounts to records of the past as seen through the eyes of earlier ‘lore-keepers’. Historians.

We also know that ‘cast in stone’ doesn’t mean what it may have in the days of Moses and his wandering tribe. And, I suspect, that minstrels of the period would have wanted to know more about the burning bushes than the stone-age PDA (who needs 640KB?). But, hey, power to the people!

As an aside, Marley too saw burning bushes (called them ‘erbs) and wrote some damn good music during his time here on Earth. Jah mon!

Since my initial introduction via Dragons of Summer Flame, and countless hours immersed in the Dragonlance anthologies, if there is any conclusion I have drawn, it is this: the Lorekeeper, Astinus of Palanthas, is the key to the entire narrative; and Tasslehoff is the perfect foil (serendipity with a topknot and unbridled curiosity) to the machinations of Immortals. The Ultimate Historian (of any chronology I’ve ever encountered) is the ‘record-keeper’ of Time [aka Astinus the Lorekeeper].

I fancy myself as merely a ‘recorder’ of past events, viewed from the perspective of reviewing a collection of accumulated documents on a variety of subjects/topics – what I (personally and modestly) consider a database of cross-indexed, publicly published views, comments, opinions, conjectures, etc., etc. All content that is not exclusively my own is provided with as much attribution as I consider adequate (so as not to incur claims of ‘copyright’ infringement).

Acknowledging that ‘ignorance’ of the Law or of ‘intentions’ is not a sufficiently valid basis of defence against any perception of violation or infringement, my self-serving intent is to provide alternative sources of ‘referenced’ views, comments, opinions or conjectures that I perceive as being Historical: The Present Through the Past.

The White Rabbit throughout these dialogues is (none other than) Tasslehoff. Hell of a guy!!

Go ask Alice …..

Note:
For the curious – ask me how I feel about Walt Disney Studios co-opting the rich tradition of classic lores as its own creations. Disney World! Right up there on my list of ‘not-in-this-lifetime (or the next)’ places or products that I’ll not voluntarily support. My visceral dislike of anything Disney has no relationship to its ‘rumoured’ past CIA/Disney collaborations.

Those who don’t grasp that governments (globally) use mass media to advance their agendas, some more subtly than others (that is, those who choose to ignore the possibility that major corporations do engage in both social and political arenas), are deserving of their Disney-esque world of entertainment.

Undeniably, Bambi’s mother does get killed, and Old Yeller does make one last howl before he bites the fateful bullet. So my advice, if asked about either of those two heart-warming stories by Disney, would be: a) don’t play with your food; and b) everything and everyone dies, eventually.

I lean more to the Bugs Bunny and Friends characterisations – Roadrunner and Wile E. Coyote (life can be like that for some)? Pity Wile E. all you want. It’s deserved.

My biggest concern would be about the lousy products that the ACME Company has been supplying poor ole Wile E. – equipment and products that arrive with, apparently, inadequate construction and operating documentation. Consequently (Wile E.’s defence), they result in physical and psychological injury to Mr Coyote. (No one can be so stupid as to keep purchasing shitty products from the ACME Company.)

Sue! Sue!

Sylvester, Tweety Bird and Granny (a trio of favourites from Warner Bros.) remind me of the juvenile delinquents Hansel and Gretel. Seriously, who pushes the granny into the oven after breaking into her home and raiding her pantry? We ought to be thankful they didn’t have access to something like an ‘unregistered’ firearm while being juveniles.

Suck it up, snowflakes!

In both cases: don’t place all your trust in little ole grannies; not all ‘tweets’ are charming – or something to that effect. We know who the Sylvesters and Tweeters are.

Perhaps, the tweeters need to turn their attention toward ACME Company?

Millennials Will Miss Boomers

https://www.salon.com/2019/05/12/why-millennials-will-miss-boomers-when-theyre-gone/

Why millennials will miss Boomers when they’re gone
Millennials mock Boomers as out-of-touch reactionaries. But we have lost a way of thinking they pioneered
Keith A. Spencer
May 12, 2019 11:30PM (UTC)

It has become en vogue among millennials to mock the Baby Boomer generation as being out-of-touch, reactionary, and complicit in the destruction of the planet.

The media loves sensationalizing this purported generational divide.

Business Insider recently published interviews with 21 different millennials explaining “why Boomers are the problem.” An Axios-SurveyMonkey poll found that 51% of millennials agreed that Baby Boomers had made things worse for their generation.

Meanwhile, there are multiple Facebook groups devoted to mocking the oft-inane or offensive images that Boomers share on social media. Though the media may play it up for clicks, the antagonism between the two generations is real and unsurprising.

Boomers, many of whom are parents to millennials, are not known for being computer savvy.

A recent study found that Boomers were 7 times more likely than adults under 30 to share fake news through social media. Likewise, many Boomers struggle to understand the cultural mores of millennials and Gen-Z, and belittle them as “entitled” and “lazy.” Meanwhile, Boomers themselves struggle to understand the disposition of millennials, who have suffered through a historic recession and came of age in an era of increasing income inequality, rising home prices, and debt that forecloses the possibility of a middle-class life.

There is a certain irony in the popular depiction of Boomers as out-of-touch conservatives entrenched in their economic self-interests.

Boomers were, after all, originally the opposite: the Counter-Culture Generation that had lived through the Cold War, rejected the American imperialistic experimentation in Vietnam, stoked the Civil-Rights Movement and spread liberal social ideals through music, art and culture – a generation born between 1946 and 1964.

They were more radical than millennials or Generation Z: the Black Panthers, Young Lords, American Indian Movement, Students for a Democratic Society and the Weathermen – all semi-militant Marxist groups who saw themselves as connected to a larger international socialist movement – were largely Boomer creations.

The New York Times noted that there were 4,330 bombings in the United States between January 1969 and April 1970, more than one every day. Leftist millennials may have coined the term “woke,” but our politics are downright tame compared to our bomb-slinging parents’ generation.

Not all Boomers were hippies or leftists. A great deal hewed to their parents’ conservatism and formed part of the “Silent Majority,” as President Nixon called his supporters. Still, the subset of activist Baby Boomers — the ones who self-identified as hippies and radicals, who rallied under anti-war banners and perhaps even participated in violent action against the government or corporations — are withering away and dying, while our generation reduces their existence to a punchline.

The millennial stereotype of Boomers, as computer-illiterate, gullible reactionaries, is not only unfair but ignorant. It denies the radical political lessons that Boomers left our generation and which Zoomers have yet to learn. Specifically, there is a kind of anti-market, anti-capitalist mode of thought that hippies were particularly good at cultivating that could provide the key to a progressive future — and which the cynical millennial generation has lost the ability to comprehend, largely because of the economic circumstances in which we were reared.

The Hippie Legacy

On my bookshelf, I have five volumes of course catalogs for the Mid-Peninsula Free University, dating from 1966 to 1972, which were passed down from my grandfather. Though only a few “Free Universities” or “Free Schools” still exist, the free school movement spread nationwide in the mid-1960s before slowly fading away in the 1970s.

The concept of a free school was simple: local community members could come together to organize to teach and take their own classes, generally for free (or a very small materials fee). Teachers were unpaid and classes took place anywhere — houses, parks, community centers, sometimes at the beach. Anyone could apply to teach a class and anyone could take a class. The goal was both to de-centralize the notion of school itself, and to challenge what the academy narrowly defined as knowledge.

The Mid-Peninsula Free University, situated in what is now called Silicon Valley, was one of the larger free schools in the country. And the course catalogs from that era provide a glimpse of what Boomers were thinking about, doing, learning and teaching at the time.

Though some of the courses taught at the Free University were comparable to what you’d find in a community college catalog — photography or drawing classes, for instance — many of them are far more transgressive.

The 1968 catalog lists courses on “Participatory Salad” (“A non-ideological approach to the preparation and consumption of salads, dressings, and related garnishments”), “Mind Unfucking” (“Religion – mysticism – music – nature – bacchanalian orgies – total recall with hypnosis + much more – timid souls welcome”), “Zen Beekeeping” (“We will discuss what beeing is”), “Experiment in Silence” (“Have you ever tried to communicate with your mouth closed? This course will be a weekend experiment in which all forms of communication except talking and note-writing are allowed”), “Yippie Liberation” (“With colorful clothes, bells, beads, incense and long hair we will flow through plastic stores, unfree universities, summer schools and other habitats of unliberated minds and bodies”), “Yelling at the Pond” (the course description is merely a poem about water), and — my favorite — “An Evening of B.S., Etc,” whose description goes: “An evening spent around the house BS’ing, drinking beer, wine, watching T.V., reading, relaxing, playing cards, etc. with whomever may be there. Intended to cure loneliness on Wed. evenings and to allow people to get to know each other under circumstances other than formal MFU course.”

Even writing these out, I can feel my internal millennial scoffing and rolling its eyes. Growing up in an epoch of neoliberal economics and culture, millennials have been bred to feel that every moment of our lives must be monetized, or contribute towards future monetization, for our existences are so fraught and our economic circumstances so fragile. We lack the robust social safety nets, low tuitions and strong unions that undergirded the Boomer generation, and gave them the time to do things that weren’t relevant to their careers whatsoever: to sit around and “B.S.”, to yell at the pond, to experiment in silence, to ponder what bees are. These activities have no economic benefit whatsoever: they won’t make you a better worker, a more versatile employee. They’re just means of bettering oneself, waxing poetic, and pondering the world. There is a certain humanism at play here that millennials have forgotten — an empathy for other humans and animals, and the expectation of mutual goodwill. The hippies would probably have called it “love.”

In a recent BuzzFeed News essay, “How Millennials Became the Burnout Generation,” writer Anne Helen Petersen accurately describes why millennials have so much trouble “adulting,” to use a slur leveled at us by Boomers. Petersen traces millennial burnout to the way that we were raised:

As American business became more efficient, better at turning a profit, the next generation needed to be positioned to compete. We couldn’t just show up with a diploma and expect to get and keep a job that would allow us to retire at 55. In a marked shift from the generations before, millennials needed to optimize ourselves to be the very best workers possible.

Petersen explains how the millennial generation was the first that had been “trained, tailored, primed, and optimized for the workplace — first in school, then through secondary education — starting as very young children.” “Depending on your age, this idea applies to what our parents did or didn’t allow us to do (play on “dangerous” playground structures, go out without cellphones, drive without an adult in the car) and how they allowed us to do the things we did do (learn, explore, eat, play),” she writes.

The aftermath of this kind of childhood, one in which education and even leisure time were geared towards one’s future ability to be a laborer, had some detrimental psychological effects on us as kids. We millennials were taught that our ultimate purpose on Earth was to find what Petersen calls “The Job” — a dream job, the job that defines who you are as a person. “[Students] were convinced that their first job out of college would not only determine their career trajectory, but also their intrinsic value for the rest of their lives,” Petersen writes, observing how the culture of college campuses has changed:

Not until I returned to campus years later as a professor did I realize just how fundamentally different those students’ orientation to school was. There were still obnoxious frat boys and fancy sorority girls, but they were far more studious than my peers had been. They skipped fewer classes. They religiously attended office hours. They emailed at all hours. But they were also anxious grade grubbers, paralyzed at the thought of graduating, and regularly stymied by assignments that called for creativity. They’d been guided closely all their lives, and they wanted me to guide them as well. They were, in a word, scared.

I very much relate to this feeling, for I, too, am scared. And no matter how successful my career path is, I cannot stop feeling that way. I graduated from college in 2009, at the bottom of the trough of the recession, and applied to 60 jobs before getting the lowest-paid ($9/hr) and most distant one, a 50 mile commute from where I lived. I never made more than $20,000 in a year until I was 27, and never had a job I actually enjoyed until I was 29. In that time I had to defer and fend off $45,000 of student debt. I was almost always on food stamps. I kept a spreadsheet of every job I had ever applied to, now hundreds of entries long, to study how and why I was getting rejected. My twenties were largely miserable, so consumed was I with anxiety about school, work, debt, and networking: am I doing the right things to get my dream job? What if I’m not doing enough? What if this move, or this career choice, or this degree makes me less marketable or less attractive to employers?

The acquisition of a middle-class job — particularly one that I actually enjoyed, which, I had been told, was the goal of life — was so searingly difficult that I felt constantly burned-out, even when I was unemployed. If someone had invited me to a course in Experiment in Silence, or to Zen Beekeeping, I would have chuckled. I had better things to do, things that might actually help me find a real career. That I can’t connect to the hippie mindset is unsurprising in this regard.

The fact that millennials have trouble connecting with hippie culture also explains to some extent why the humanities are dying. The kind of philosophical reflection and humanistic thinking innate to the humanities aren’t applicable to the business world, at least not directly; they won’t get you a job unless your boss is a sympathetic former humanities academic. That our society has become increasingly supremacist about STEM skills, while mocking the humanities, seems to relate to our lack of free time, our obsession with work and with monetization.

As the Mid-Peninsula Free University attests, Baby Boomers were the last generation able to conceive of a world where capital didn’t govern all aspects of our lives. In between, Generation X found its rebellion in “slacker culture,” rejecting the Reaganite yuppie work culture; millennials went the other direction, all-in on the workplace, as we were told since we were young that this was the purpose and function of our existence.

Hence, millennials often spend our leisure time on activities that will have the ancillary effect of making us money in some way, or making us more marketable as employees, or getting us closer to that dream career, or helping us stave off poverty. Consider social media, a near-universal hobby of my generation. Twitter may be “fun,” but I also know if I get enough followers, it can serve as a safety net if I get in an accident and need to raise money for medical bills. Instagram and Snapchat and TikTok are similar — in that having a great number of followers can help you sell your art, music, wares, crafts, whatever.

The near-universal desire to go viral — an impulse that drives many millennial artists — means that monetizing all aspects of one’s existence has become normalized. Millennial DIY and “maker” culture is all about innovation and thrift, means of making oneself more economically self-sufficient. Millennial wellness and fitness culture is frequently synecdoche for self-improvement and career success. So much of what is popular among my generation is also an economic calculation, designed to make us better workers, or have more economic opportunity, or to serve as a safety net in place of the one that conservatives destroyed over the past few decades.

Millennials haven’t forgotten how to live without monetizing our existence. Rather, we never knew how to in the first place. There has never been a social safety net for us, at least not one of much import; unions haven’t been widespread, college costs are so prohibitive as to put many of us in debt bondage, public housing is nonexistent and Obamacare is a flawed “free market” solution that fails many.

My greatest fear for my generation is that in the process of forgetting how the hippies thought, we will forget that love is crucial to any actual progressive-left movement. You can see the seeds of this in how online discourse around politics happens: those on the opposing side are castigated as irredeemable. Yet a truly progressive vision of the future has space to redeem those who are bigoted in some way, recognizing that they became so in part because the right has gotten to them first; worse, the left never showed them comparable compassion, nor sought to truly understand the material underpinnings of xenophobia and bigotry in a way that could connect to those who had been ensnared by hatred.

This is how politics may die if we forget the lesson of the Boomers. A lack of love and an inability to perceive the world except in marketable terms will doom any left-progressive movement. The hippies may have grown complacent and reactionary in their old age, but they were right, at least, about the need to cultivate a counter-culture that opposes the mainstream culture engendered by capitalism. We cannot envision a post-capitalist future if we do not first experience it among ourselves; if even our leisure activities are centered around monetizing our existence, there is no hope for an alternative.

Keith A. Spencer is the cover editor for Salon, and manages Salon’s science, tech and health coverage. His book, “A People’s History of Silicon Valley: How the Tech Industry Exploits Workers, Erodes Privacy and Undermines Democracy,” was released in 2018 from Eyewear Publishing. Follow him on Twitter at @keithspencer

Privacy In The Era of Digital Networks

The Expectations of Personal Privacy In An Era of Digital Information Networks

13/06/2019 @ 07:52 AM EST

I recently read a couple of articles on two different subjects that can be tied to each other through an underlying focus: the impact of current digital technologies and their contribution to the erosion of personal privacy.

The first article, Why we can never put the Big Tech monster back in its box [ https://www.cbc.ca/news/business/tech-giants-reform-1.5170642 ] addresses the question from a business/financial perspective: business monopolies and legislative controls.

“In its quest to eliminate the digital gatekeeper monopolies, Congress may face a number of insurmountable problems. One is the speed with which technology transforms. Any new rules by the plodding legislative and judicial process could fail to catch up with an industry that has already moved on.”

“A related problem is the complexity created by digital connectedness. Problems like security and privacy, high on the congressional agenda, are not something that can be solved once and for good. Beyond economics, social and civil discourse have been changed forever: Instant information in your pocket. The ability to share your views with a wide audience. The ability to try to manipulate the political dialogue — for good or for ill.”

“With our urge to watch the same team play, to shop in the most convenient way, to stream the most popular videos, read the hottest news and talk in the forums where all of our friends are talking too, it may be that we’ve all contributed to creating a monster that will be hard even for Congress to kill.”

“The current landscape suggests there are only one or two significant players in important digital spaces, including internet search, social networks, mobile and desktop operating systems, and electronic book sales,” the head of the U.S. Department of Justice’s antitrust division, Makan Delrahim, said in a speech this week, titled Antitrust Enforcement and Digital Gatekeepers.

Don Pittis
Business columnist ©2019 CBC/Radio-Canada. All rights reserved.

Tech News Recaps

Digital Worlds

The Megaprocessor – A CPU You Can See In Action

About 10 years back, this is what was making the headlines with respect to questions about government surveillance and individual privacy:

For more than a decade now, Americans have made peace with the uneasy knowledge that someone – government, business or both – might be watching.

We knew that the technology was there. We knew that the law might allow it. As we stood under a security camera at a street corner, connected with friends online or talked on a smart phone equipped with GPS, we knew, too, it was conceivable that we might be monitored.

Now, though, paranoid fantasies have come face to face with modern reality: The government IS collecting our phone records. The technological marvels of our age have opened the door to the National Security Agency’s sweeping surveillance of Americans’ calls.

Torn between our desires for privacy and protection, we’re now forced to decide what we really want.

What’s being done with your data: Experts ask, shouldn’t someone get this under control?

Now that Facebook, Google and Amazon know pretty much everything about us, how they’re using that information is drawing the focus of politicians throughout the Western world, asking in effect: “Shouldn’t something be done about this?”

It’s not as though none of this was foreseen.

What are Web services anyway?
By David Berlind
February 12, 2002

“Web services” is nothing more than a fancy moniker for “big honking API (application programming interface).” Consider what xDBC (replace the x with O for “Open” or J for “Java”) and SQL (structured query language) did for databases. They made it possible for just about any software – reporting tools, spreadsheets, even word processors – to extract query results from any database. Any software. Any database.
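
As a quick aside for the non-geeks: Berlind’s “any software, any database” point is easy to demonstrate with today’s equivalents. Here is a toy sketch of the idea (mine, not from the article), using Python’s built-in DB-API and SQLite as stand-ins for the xDBC drivers of the day.

# A toy demonstration of the "any software, any database" idea:
# a standard API (Python's DB-API, via sqlite3 here) lets generic code
# run an SQL query without knowing anything vendor-specific.
import sqlite3

def run_query(connection, sql, params=()):
    # Generic helper: works against any DB-API-compliant connection.
    cursor = connection.cursor()
    cursor.execute(sql, params)
    return cursor.fetchall()

conn = sqlite3.connect(":memory:")  # any database could sit behind this handle
conn.execute("CREATE TABLE refs (title TEXT, subject TEXT)")
conn.execute("INSERT INTO refs VALUES ('Leviathan', 'politics')")
print(run_query(conn, "SELECT title FROM refs WHERE subject = ?", ("politics",)))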

Sun and Microsoft compete for IDs
By Connie Guglielmo & Charles Babcock
October 3, 2001

When Sun Microsystems took to the pulpit last week to propose an alternative to Microsoft’s Passport, the move marked more than just another showdown between the technology industry’s two fiercest rivals.

This time, Sun and Microsoft are on a bigger quest: to create a standard for digital IDs. One of the Holy Grails of online computing, the digital ID has been touted as the magical key that will unlock the Web and turn it into a wonderland of convenient, personalized services, while warding off crooks intent on stealing personal and credit card data from unsuspecting online users who want to live, work, and play in the virtual world.

Sun challenged Microsoft’s Passport by launching the 33-member consumer-oriented Liberty Alliance Project, which will supply online user IDs and authorization. Sun announced the venture in New York City, which is still reeling from the Sept. 11 terrorist attacks.

As the U.S. continues to cope with the aftermath of the attacks, better forms of identification–digital IDs as well as a possible national ID authorized by the federal government–are being mentioned by some as one of the many cures for the nation’s security ills.

Facebook travelled in the internet’s grey areas and now it faces a reckoning

Stuart Thomson
April 2, 2018 11:26 AM EDT

Your Google Play music tastes, your entire YouTube watching history, all the places you’ve used your phone, which can be displayed on a Google Map that places red dots on your whereabouts over the years. You can see all the ads you’ve clicked, the photos you’ve uploaded, the emails you’ve sent.

Facebook keeps a record of all your login attempts, with information about where, when and what device you used. Facebook knows all your friends, your likes and dislikes and it predicts your tastes based on what it knows.

It’s the digital equivalent of walking into a stranger’s house and finding zoom lens photography of yourself plastered over the walls. You’ll want to run screaming from the room and never look back.

These features can usually be disabled, but most of them are turned on by default. And Google and Facebook aren’t the only ones doing it; they’re just the best at it.

This kind of data is the lifeblood of the internet and anyone who’s ever thought for a minute about the business models driving all these companies should understand what’s going on. Google’s parent company has a value of $700 billion and Facebook has been hovering around $500 billion, based on products that are free for most users. There is an axiom for that: If the product is free, you are the product.

And it’s fair to say that Silicon Valley saw this coming.

“You have zero privacy anyway. Get over it,” said Sun Microsystems CEO Scott McNealy.

That was in 1999, when Google was only a few months old, eight years before the iPhone and four years before Mark Zuckerberg started his “FaceMash” website in 2003 which allowed Harvard students to pass judgment on the hotness of their classmates.

A year later Facebook was born and a company called Cambridge Analytica was founded nine years after that, in 2013.