
Literature and Libation


Browsing Tag: computers

PSA (Public Spelling Announcement)

February 4, 2011 · by Oliver Gray

Attention, all users of the internet, from forum trolls and thread stalkers to meme promoters and other tube denizens:

Although they melt my soul, I can forgive mistakes associated with “there/they’re/their”, “where/wear”, “it’s/its”, “affect/effect”, and even the egregious “loose/lose”. All of these can be explained by a slip of the mind, a marked lack of education, or even inebriation. They are unfortunate and regrettable, but at times, forgivable.

In comparison, spelling words incorrectly makes you look dumb. Incredibly dumb.

I read thread replies, video comments, and (most tragically) blog posts that look like someone injected Novocaine into their hands and just let their limp, lifeless appendages fall all over the keyboard for a few minutes. I cannot focus on the content of the writing, or, more importantly, the message it is trying to convey, when every other word is spelled so badly that my brain dies a little.

I focus on spelling because, unlike with grammar, syntax, or diction, there is no excuse for spelling things incorrectly. Even if your little pygmy brain can’t remember the proper order of letters in basic words, there are so many tools available that automatically correct your spelling that it borders on absurdity.

Using correct grammar requires some cognitive processing, a concept that, I acknowledge, a lot of people in the English-speaking world are not comfortable with. Correct word choice requires actually knowing what specific words mean, and I can let a below-average lexicon slide. At the very least, incorrect grammar and diction can be pretty comical (especially in the case of extreme malapropism), giving them some redeeming character.

Spelling things wrong is just plain unacceptable. It’s not funny, cute, or even remotely endearing. All it does is make you look like a lazy imbecile, whose writing I shouldn’t bother wasting the eye-energy reading. Abbreviations that use just as many characters as the actual word, or that mutate the word to add a letter that isn’t even in the original (cuz, cos), are why people are driven to drink.

Mozilla Firefox, Google Chrome, and Apple’s Safari all have built-in and enabled-by-default spell checkers. Internet Explorer has about 40 plugins available for download that will let you know when you’ve failed at communication. Voluntarily using IE makes me question a person’s capacity for development anyway, but I digress.

It is simple. Make friends with these:

Introduce yourself to the red squiggles. This universal plague-mark of misspelling is here to help you not look like a third-world degenerate; you should thank it and buy it nice presents (coincidentally, the red squiggles live at my house, so feel free to send the presents there).

If you have, up until now, been ignoring these little red lines, claiming ignorance as to their purpose, I implore you to recognize their existence and importance. They don’t just show up for fun to make your writing more colorful; consider them screaming alarms that your words are in physical pain. A misspelling is like a wound on the word, the red line its veritable lifeblood pouring out and pooling underneath. Ignoring these lines seals the fate of those poor, malformed words, and of any hope of people taking you seriously.

If your documents, posts, or other assorted keyboard regurgitations start to look like this…

…you should probably try right-clicking each word that is underlined with red squiggles and choosing another word from the list that appears. Even if you don’t know what the suggested word means, the computer is clearly smarter than you, and you should probably do what it says.

Notice there is a blue line (and sometimes even green lines!) underneath the word “no”. This means that the computer has noticed that you spelled something correctly, but used it incorrectly. Good try!
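For the curious, there is no magic behind that right-click menu: the spell checker simply hunts through its dictionary for words that are close to whatever you mangled. Here is a minimal sketch in Python, using the standard library’s difflib as a stand-in for the real machinery; the tiny word list and the suggest function are my own illustration, not any particular spell checker’s internals:

```python
import difflib

# A tiny stand-in dictionary; real spell checkers ship with ~100,000 words.
DICTIONARY = ["definitely", "separate", "because", "friend",
              "receive", "believe", "weird", "grammar"]

def suggest(word, max_suggestions=3):
    """Mimic the right-click menu: offer the closest dictionary words."""
    if word.lower() in DICTIONARY:
        return []  # spelled correctly, so no red squiggle
    return difflib.get_close_matches(word.lower(), DICTIONARY,
                                     n=max_suggestions, cutoff=0.6)

for typo in ("definately", "recieve"):
    print(typo, "->", suggest(typo))  # e.g. definately -> ['definitely']
```

Real spell checkers use fancier edit-distance tricks and far bigger word lists, but the principle is the same: the computer already knows the word you meant, and it is offering it to you for free.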

P.S. For the record, and as a caveat to the second paragraph of this post: I can never forgive “then/than” mistakes. It’s not even kind of hard to know which is which.

The Luddite Dilemma

January 12, 2011 · by Oliver Gray

For those of you unfamiliar with the etymology of the term, the Luddites were a group of 19th-century British textile workers who physically and politically opposed the Industrial Revolution. It was their belief that the mechanization of production would eliminate jobs, encourage shoddy workmanship, and ultimately cause the downfall of the entire industry. The term has since evolved, and is now used to describe anyone who is opposed to, or inept with, modern technology.

Many people use the term in self-deprecation, suggesting that they, in their misunderstanding and incompetence, somehow represent the Luddites in their quest against progress. While I appreciate and admire the use of an archaic term, this comical application doesn’t quite encompass my definition of the word. To be a Luddite, one must be vehemently opposed, even if only subconsciously, to technology, automation, and most importantly, progress.

Those who know me might say that I am in some ways a Luddite, as I have voiced concerns over the necessity of things like smart phones and 3-D televisions, but I never argued that these things should not exist. I recognize the impact that smart phones have had on our culture, and I do not oppose the idea; I simply oppose so many people relying on a singular gadget for so many things. I worry that a generation raised with all the world’s knowledge in their pocket will never know the joys of reading a book and having to go find a dictionary to look up a word; hell, they may never even need to read a book, which I find quite depressing. It is one thing to understand a piece of technology, use it functionally, and oppose it for educated reasons; it is quite another to disdain and denounce something because its strange magicks are like voodoo to your voluntarily primitive mind. The latter describes my kind of Luddite, and unfortunately, their breed is just as prevalent today as it was at the turn of the century.

At a point in the late 90s, it was sort of cute to be a technological dunce. People joked that they didn’t know which button turned their computer on, that their CD-ROM drive was really a cup holder, and that the internet consisted solely of animated GIFs of fire and men digging figurative HTML holes. We, as a culture, accepted this attitude, especially from an older generation who had never needed a computer and had little to no experience operating one. It was like a 16-year-old learning to drive a car; we laughed lightly at their attempts to parallel park but knew someday they would master at least some of the subtleties of driving.

The analogy between computers and driving stops there. While it is a given that most people will eventually figure out enough about driving to not crash into something every time they turn the wheel, the same cannot be said about people who fire up their computers. The mindless majority often don’t realize that they are driving around a controlled explosion, nor do they really understand how their vehicle works, but at least they understand how to use it (i.e., push the pedals and turn the wheel) and what not to do with it (crash into things). No such assumption can be made about someone with a computer; owning and using one does not guarantee the development of an appropriate skill set. Somehow, in the cosmic chaos, education on computing was left as an optional check box, which most people left blank.

My experience as a desktop support monkey has provided me with years of anecdotal proof of this strange phenomenon, and my current position as a young professional in a sea of old-schoolers has meant daily frustration at the hands of those with an aversion to technology. It seems strange that a tool which promotes efficiency and convenience would be so widely misused and underappreciated, but too many people, many of whom can’t even claim age as an excuse, seem to be willfully ignorant when it comes to anything that has to do with a computer. The sad fact is that this aversion is no longer OK; computers are no longer a cutting-edge, fringe concept that can be ignored. They are integral to functioning normally in this new age. Being old-school only works if you are actually old.

A computer is, despite its complexities, a tool. If your life required you to constantly adjust screws, after a while you would figure out how to skillfully use a screwdriver. It would take a very special mind to struggle with the concept of a piece of metal that you rotate in your hand. While a computer is a much more sophisticated tool, the basic principle remains the same. After months (sometimes years) of daily use, a user should learn what their computer does, and why it does it. They should also learn what it cannot do, and what happens when they do the wrong things. Ultimately, they should develop an understanding, without any formal training, of how their tool functions and in what capacities. To garner no insight after years and years of using a tool suggests either that the person is incapable of learning at all, or that they actively refuse, for whatever reason, to learn anything about computing.

The latter has to be the truth; otherwise we have to surmise that we live in a society where millions of people are somehow surviving with debilitating learning disabilities. Since that is obviously not the case (ignoring Jersey Shore fans and the entire {and future} cast of 16 and Pregnant for the moment), there has to be a deeper reason why the normal, heuristic method of learning does not apply to computers. My only guess is that somehow, the complicated roots of very early computing still vex everyday users, who simply refuse to acknowledge that using a computer is now easy. I truly think the majority of people write their own self-fulfilling prophecy by presuming that using a computer is beyond them, as if only highly skilled nerds who dedicate their lives to the mystical intricacies of coding and software development could possibly operate such a dense piece of machinery.

This overarching concept is what companies like Apple built their entire marketing platform on. When I see an Apple product, all I can think is, “Who cares how it works? It’s pretty and it just does.” Apple removes the fear of using a computer by taking away any challenge or risk, and they have fallen all over themselves to prove this to their target audience. Don’t want to deal with the terrifying (but easily avoidable) world of VIRUSES?!?! Get an Apple; we don’t get scary viruses, so you’ll be fine. People love this concept; the imaginary complications are taken away, and suddenly, they are masters of their technology.

Unfortunately, with safety comes limitation. Apple tends to “lock people in”, telling them what software they can and can’t use, forcing them to purchase everything related to their computer through them, and ultimately taking away any freedom of computing. Their clever guise of accessibility and safety obviously works, but it does nothing to solve the original problem of people fearing their computers. The irony is that the same feeling of safety can be achieved using any operating system on any computer from any manufacturer.

Awareness is key. The majority of problems people experience come from not knowing what they’re doing, and, more specifically, not caring that they don’t know what they’re doing. I use a Windows-based PC, spend a lot of time on the internet, don’t run any virus protection software, and yet – gasp – I never get any viruses. How, then, do people with Norton, McAfee, Avast, Kaspersky, or any of the other hundred antivirus suites manage to collect dozens of malicious objects every month? I do have a passion for computing, but I am hardly more intelligent or dedicated than your average user. The difference is that over time, I have learned not to download attachments from people I don’t know, I’ve learned which websites are sketchy from a single glance, and I’ve learned that the person using the computer makes all of the decisions, not the other way around.

I am sick to death of people claiming they “did nothing” to their computers. I hate to tell you, but if you just plugged your computer in and turned it on, it would do absolutely nothing until a piece of hardware died. That could take years. When your computer “acts up” or “has a mind of its own,” it’s because of something you (or someone who used the computer) did, not because there is a goblin living inside of it who is hell-bent on ruining your day. There are relatively few problems caused solely by a piece of software going ballistic, and those usually manifest only after the user has thoroughly abused their machine. It is not only time to embrace computer education, but also time to stop deflecting responsibility for computer problems by claiming some invisible, malevolent force screwed things up without your knowledge. There is nothing magical about a computer; it works just like your toaster – bread in, toast out. If you pour Gatorade into your toaster because you don’t know better…don’t expect toast.

If a person is lagging behind the norm in 2011, chances are they will be behind until they die. It is almost too late to play catch-up now; if you were too slow to fully grasp file formats, the basics of websites, word processing, and the difference between CC and BCC, then by the time you do, there will be a hundred other things you have to learn. I am not saying that a person shouldn’t try to educate themselves, nor do I expect everyone to be able to solve any problem that ever arises related to technology. I do wish people would embrace, instead of eschew, what is undoubtedly the future of American society. Sooner rather than later, it won’t be “regrettably endearing” that you can’t function around a computer; it will be unacceptable, and it will ultimately make you look stupid.
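Since CC versus BCC trips up so many people, it is worth showing that the difference is not mystical. A CC address lives in the message headers, where every recipient can see it; a BCC address is deliberately left out of the headers and handed to the mail server only as a delivery address. Here is a minimal sketch using Python’s standard smtplib and email modules; the server name and addresses are placeholders of my own, not anything real:

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "me@example.com"
msg["To"] = "alice@example.com"
msg["Cc"] = "bob@example.com"   # visible to everyone who receives the mail
msg.set_content("Spelling matters. So does knowing who can see this.")

# BCC works by omission: carol is never named in any header; she is
# only added to the envelope recipient list handed to the server.
bcc = ["carol@example.com"]

with smtplib.SMTP("smtp.example.com") as server:
    server.send_message(msg, to_addrs=[msg["To"], msg["Cc"]] + bcc)
```

Every mail client performs some version of this dance on your behalf, which is exactly why there is no excuse for not knowing which field does what.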

So to all those Luddites out there actively or subconsciously trying to avoid learning something new, I say wake up. The digital age is no longer dawning. It has long since dawned, it is about 1:42 PM, and the midday sun is shining on your still-sleeping face.
