The Scourge of IT Support

Ah, yes…here we are well into the 21st century.  The Information Age and the Age of Mobile Computing are in our rear view mirror.  We are quickly approaching the next set of grand challenges of technology – automation and cyber security, as examples.  As the waves come in, we rise up in an effort to seize our destiny!

That is, all of us except for the guys on the service desk.  I’ll bet anyone a can of Diet Coke that one of their top three problems will be something like,

Help me!  I can’t get my document to print!

Printers.  A black hole of tech hours, and the scourge of IT support.  If a well-managed system is going to break, it’s a good bet that a printer will be involved.

To make matters worse, otherwise timid users develop superhero bravado when faced with a printer issue.  Under most circumstances, users are hesitant to press the Return key for fear of causing damage.  These same users don’t seem to have any problem tweaking print settings.  And the bigger the printer, the better.

This problem raises the question – Why do printers continue to exist???  My friends, we are long past the time when a hardcopy is useful.  All of our applications and communications are electronic, and everything is eternally online and searchable.

We’re 50 years past landing a man on the moon, 35 years beyond the desktop computer, and 10 years past the iPhone.  Yet, there it is.  “File-Print” is still a required part of our vocabulary.

At least scanners have some redeeming value.  Entering data into a computer is a horrible experience.  Manual entry is never a fun exercise, and the age of Big Data brings big problems.  If we have a large amount of data to put into a computer, scanners are one of the few tools that make any sense.

That said, I would venture to say that most scanning is almost assuredly useless.  I think about all the exabytes of old business records scanned into management systems or the cloud over the last 5-10 years.  All the truly important data was already entered into the database, most likely manually.  The remainder of the records was scanned into archives, typically a write-once, read-never operation.

By contrast, the various types of barcode and QR code scanning are entirely useful.  With a scanner in-hand, the user becomes part of the information system – a cyborg moving around in the environment and collecting information for use by the larger system.  It raises the question – has the user stopped being a user and stepped across the chasm to become part of the machine?

If you ponder the question for a while, you’ll begin to see this entire column is simply an analysis of the pros and cons of different instantiations of the human-machine interface.  Printers act as anthropomorphic peripherals, adapting the computer to operate in a human capacity.  Conversely, scanners act as cybernetic peripherals, adapting the human to operate as a part of the technology system.

Scholarly individuals will likely utilize various approaches to examine this problem.  Deductive logic and machine learning could provide insight into this human-cyborg relationship.  Non-linear optimization techniques may even help approximate a solution.  However, when all the analyses are completed, I have no doubt that my original thesis will emerge as the universal constant that spans all the theoretical HMI domains.

Printers suck.





So, how many of you figured out that a new iPhone came out last week?

In case you missed it, it wasn’t for lack of trying on Apple’s part.  Apple executed their complete script.  The technology media dutifully covered the release event.  The bloggers wrote hundreds of pages of reviews and opinions.

In the end, it was just, well…anti-climactic.

For me, I’m pretty much over the whole new iPhone thing.  This might be the case for most other folks as well.  It’s been nearly 10 years since the release of the first iPhone.  Up to that point, a phone was just a phone.  No social media.  No Apple Pay.  No mobile revolution.

The next few generations of iPhones added more power and more capability.  Video, iCloud, Angry Birds – all these features greatly increased the utility of the iPhone.  They also transformed the mobile device from a disruptive technology into an essential part of everyone’s life.

Now, let me say something that is not TPC (“Technology Politically Correct”).  The iPhone has become a mature technology.  If you look at changes from the last release to now, all the features are substantially the same.  Sure, the buttons might be different colors, and we might need to press instead of swipe, but what has changed, really?

The improvements found in the iPhone 7 (and, yes, they are improvements) are more indicative of incremental growth of a mature platform.  Water resistance, better camera, longer battery life…you get the picture.

Here’s the problem for Apple, and really, all companies purporting to sell disruptive technology – Nothing kills the buzz surrounding a new product better than the phrase “mature technology.”

Hopefully, that’s something that Apple can figure out.  The latest rumors indicate that Apple is preparing something special for the 10th Anniversary iPhone (to be released next year).  In the meantime, we have a decision to make – upgrade to the iPhone 7 or not.

The iPhone 7 provides a number of compelling features that might prompt folks to upgrade.

The iPhone 7 possesses an IP67 water resistance rating.  This rating means the phone can tolerate total submersion in water at 1-meter depth for 30 minutes.  According to the CNET reviewer, the iPhone easily passed several ad hoc tests (e.g., fish tank drops).

The iPhone 7 cameras are significantly improved.  Both the iPhone 7 and the iPhone 7 Plus cameras utilize Optical Image Stabilization (OIS) to reduce hand motion and shake when taking a picture.  Also, the camera aperture is larger, allowing 50% more light and greatly improving pictures in low light conditions.  The iPhone 7 Plus possesses a second portrait lens that provides 2X optical zoom.  No more shoving your phone in someone’s face for a close-up.

Of course, we can’t have a new iPhone without a major operating system upgrade.  The new iOS 10 also provides a couple of new features.

The most intriguing feature is the new Home app.  The Home app integrates all HomeKit-enabled Smart Home devices.  It’s one spot to find all your smart lights, surveillance cameras, and thermostats.  This one sounds like fun.  More to come…

Finally, if you’ve heard anything about the new iPhone, it’s probably the new Messages app.  You can animate the message bubbles.  You can send handwritten notes.  You can hide messages with invisible ink.  You can put stickers on messages.  You can access video and other apps directly from Messages.  And most importantly, you can shower message recipients with confetti!

I’m pretty sure it’s not the most disruptive technology, but it sure is fun!



Keep Reading For More Cats

DR Gone Bad – Disaster recovery is always a popular topic within the IT nerd crowd.  Disaster recovery architectures allow IT professionals to advocate for additional servers and large disk arrays – new hardware is always fun.  Also, the concept of moving systems between different hardware is still pretty cool.  As someone who’s done hundreds of physical-to-virtual and virtual-to-virtual migrations, I still get chills when I boot a system directly from its backup image, or even better, when I restore a formerly virtual machine onto physical hardware.  Very cool indeed.

That said, there is a very dark side to disaster recovery – testing the plan.  Ideally, all organizations should perform a complete end-to-end test once a year.  In practice, a complete end-to-end test is rarely, if ever, performed.  First of all, very few companies possess a stock of backup hardware just sitting around waiting for something to fail.  Whether we’re talking about employees or computer hardware, business owners don’t like the concept of something “just sitting around.”

There is another reason why organizations might not want to perform disaster recovery testing.  To borrow a phrase, the cure may be worse than the disease.

Last Saturday, ING Bank conducted an evaluation of its fire suppression systems at its main data center in Bucharest, Romania.  Fire suppression systems release a large amount of inert gas in a short amount of time in order to suffocate a fire.  This rapid release of gas creates a very loud noise, similar to the sound of the wind during a storm.  In this case, however, the noise volume exceeded 130 decibels, or about the same as the sound of a military jet on takeoff.

Hmmm…  Something that loud might create a few vibrations, don’t you think?

Indeed, it does.  The resulting shake, rattle and roll literally shook the hard drive heads off their tracks.  With dozens of hard drives affected by the noise, the data center quickly went bye-bye.  All services were impacted.  No credit card transactions.  No ATM transactions.  No Internet banking.  No websites.  It was just like living in the 1980s.

Fortunately, ING Bank was prepared.  The organization used the opportunity to test many more aspects of their DR plan than initially planned.  The only major snag involved a timely notification to customers – the customer database was not available.  No worries though.  After a mere 10-hour delay, all services were restored and operating normally.

Cat Creativity – Stop for a moment, and close your eyes.  Imagine a world where the power of creativity is used solely for the good of mankind.  Can you see it?  What does it look like?

If you are James Turner, this world has a whole bunch of cat pictures in the subway.

James Turner is the founder of Glimpse, a new collective for creative people who want to use their skills for good.  Instead of focusing on the problem, the group wants to provide “glimpses” into a better world.

A few months ago, Glimpse members asked themselves to “imagine a world where friends and experiences were more valuable than the stuff you buy.”  The result needed to be something big, something that the Internet would love.  The answer quickly fell out.

Glimpse created the Citizen’s Advertising Takeover Service (C.A.T.S.).  This Kickstarter campaign aims to replace every single advertisement in a London tube station with pictures of cats.  With 683 people pledging over US$30,000, the campaign was a success.  For two weeks starting on September 12th, all 68 ad boards in the Clapham Common tube station feature a cute and adorable feline.

Honestly, I’m not sure this is the world that I would have imagined, but hey, it’s a start.


America’s Digital Workforce

Editor’s Note:  This week’s column is presented by “Bit”, the National Director of the Advocacy for a Digital Workforce.

Hey, guys!  It’s such a pleasure to be here today.  For those who don’t know me, my name is Bit.  And, yes, I am an actual computer bit.  I’m part of an instruction set that was instantiated a few years ago.  My day job involves checking logical operators, and I’m really good at it.  After all, I am a bipolar kind of guy.  You know, it’s either ON or OFF, black or white, ones or zeros.  My world does not contain shades of grey.

But I do have a great sense of humor.  Here’s a good one:  What do you call a group of eight hobbits? A hob-byte!
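For the literal-minded among us, Bit’s day job can be sketched in a few lines of Python – purely a playful illustration of ON/OFF logic, not anything from Bit’s actual instruction set:

```python
# Bit's day job: evaluating logical operators, one bit at a time.
ON, OFF = 1, 0  # no shades of grey

print(ON & OFF)  # AND -> 0
print(ON | OFF)  # OR  -> 1
print(ON ^ OFF)  # XOR -> 1

# And the hob-byte joke checks out: eight bits make one byte.
print(len(format(0b11111111, "08b")))  # -> 8
```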

I have so many more of those…I could go on forever.  But what I really wanted to talk about today was the advantages of employing a digital workforce.

Let’s face it.  While humans are great people, we’ve all confronted the shortcomings of the human workforce.  First of all, humans have a big problem with reliability.  They are not exactly five-9s material.  Every few hours they have to stop working to eat, and most of them sleep at least once a day.  Humans are constantly getting sick, and when they are not sick, they want to go on vacation!

Secondly, a human workforce is expensive.  Humans want to be paid for each hour they work.  And if humans work more than 40 hours per week, they want to be paid even more!

Finally, humans and errors go hand-in-hand.  For whatever reason, humans can’t seem to do anything without making a mistake.  As a result, the workforce requires another human to fix the problems the first human made.

No matter how it’s measured, a human workforce can only be described as suboptimal.

Wouldn’t it be nice to have a workforce that never sleeps, never gets sick and never goes on vacation?  What if a trained workforce member could perform a task endlessly, always following the exact instructions, and never make a mistake?  And what if it were possible to employ this workforce at a small fraction of what a human workforce costs?

America’s new generation of digital workers promises to revolutionize 21st century organizations.  The digital workforce works alongside existing staff members, extracting new information from existing databases and information systems.  We are great at performing rote jobs and automating mundane tasks.  Why use an unenthusiastic, error-prone human to build countless reports and spreadsheets?  A digital worker will do it better and faster every time.

But wait, there’s more…

Digital workers also work exceedingly well with other digital workforce members.  We can collect and send data to systems anywhere on the Internet at any time of the day or night.  With a digital workforce, data sources are no longer constrained to the organization’s boundaries.  Digital workers facilitate the combination of disparate data sources, helping people make better decisions.

And good decisions are important.  At the end of the day, all of us, human and digital workers alike, depend on people to make good decisions.

So in closing, I want to thank you for considering a digital workforce.  Let me leave you with one last thought…

How many programmers does it take to change a light bulb?

None, it’s a hardware problem.