Spam on LinkedIn

It should come as no surprise that our modern-day ability to communicate brings with it some communications we might find disruptive. Spam is a reality. Generally, it’s not a big deal. It’s easy to spot and easy to ignore. Sometimes there’s enough of it that you want to do something about it.

Yesterday there was enough of it that I felt something had to be done.

Among my other completely normal LinkedIn messages I had 3-4 spam messages inviting me to pay for training that would lead to professional certifications. The messages are usually pretty formulaic. Here’s an example:

Hello How are you?

welcome to my LinkedIn Family 😊

would you like to go for PMP, Six Sigma, Scrum Master, CAPM, ACP ITIL Certifications? or any other Training and Certification?

I do try to treat spammers as though they’re real people. I want to take up their time (so they don’t go after the more gullible), but if I ever met one of them on the street I wouldn’t want them to feel like beating me up.

Most problems in a workplace are caused by miscommunication, so I’ve found it fun to intentionally miscommunicate with spammers. The other thing you have to do is make it easy for them to spot that something isn’t right, like asking for a larger amount of money than normal. In the case above they were inviting me to attend training. I responded as if they were inviting me to teach training. I also claimed I would need more money than they probably budgeted for a guest speaker.

I’d be happy to assist with any of your classes (if it’ll work into my schedule).  I’d like to be up front.  I expect 2x the government per-diem rate in addition to my normal hourly fee. Please let me know locations, dates, and times you’d like me to assist and I’m sure we can try to work something out.

If they write back I can’t wait to tell them about my hourly fee and travel expectations. Normally I fly coach, but for these people I expect to fly first class.

What’s your best story of dealing with spam?

The Best Kind of ERROR

Our modern experiences have taught us to be familiar with different types of error messages. These are not always unpleasant. They’re just part of the feedback loops of our modern ecosystem. Various projects have even attempted to make this a more enjoyable experience. In Chrome you can play with a dinosaur while you wait for your internet to be restored.

If errors are normal (and they most certainly are) what is the best type of error?

I’d like to submit that the best types of errors are user errors.

Yes, people failures are the best types of failures to have.

Yesterday I was on a call where we spent 45 minutes troubleshooting an error message that we could have easily cleared out by hitting the enter key twice. We just didn’t know that at the time.

Sounds like a waste of time, right? It felt like it too.

But that’s the exact opposite of what it was. A waste of time would have no positive benefit.

Almost at once we were all in the zone of trying to get past the message. We were all in our learning framework. We activated social networks to help us troubleshoot. We were all focused on the same obstacle. It was a great unintended team building activity.

When we did discover the solution, there’s not a single one of us who didn’t learn what it was. We were all in our learning mode, and the hard-to-find answer helped solidify the solution in our memories.

We’re not going to do that again.

The other reason I believe user errors are the best kind of errors is that users can be trained. In the above example we’re not going to make that same mistake again, and we left notes and videos for anyone who follows us so they don’t have to learn the same lesson through the same frustration.

Equipment and software aren’t so quick to fix. There’s a logistical chain that has to be considered when addressing an equipment error. In software there’s a development chain that has to be considered. In my experience, neither of these responds as quickly as simply asking someone to do it differently.

What’s the best type of error? User Error.

Because users can be trained, and once they are, they’re changed.

So, dear reader, are you easy to train?

Ignoring Stakeholders While Upgrading

When the United States Army isn’t deployed fighting a war, it’s supposed to be training for one.  As part of the Executive Branch, it is required by law to account for its use of authorized funding.  As The Fiscal Times reported in March of 2015, the DoD can’t account for $8,500,000,000,000.  One of the areas where the Army has tried to improve its accountability is its training management system.  In 2014 the Digital Training Management System (DTMS) received a major upgrade that turned it into a thoroughly embarrassing debacle within the Army.

Once released, the new website lasted just a few hours before software issues took it down for several weeks of maintenance.  Although the upgrade shipped in October, problems were so bad that the Army withheld its press release touting the new features until January 12th of 2015.  Attention to detail was so low that when the article was finally published, the press release stated it was published in 2014 because Mike Casey (the author) forgot to change the date to 2015.  This lack of attention to detail during the project development lifecycle doesn’t begin or end with a delayed and misdated press release.

While the aforementioned press release mentions how the program serves commanders in conducting training management, it fails to identify which level of commander.  My personal opinion is that it serves commanders at brigade (BDE) level and higher, who would have a difficult time gathering training information on their more than 1,000 formations without an automated tool such as DTMS.

The command level least served by the software is the company level, the lowest level of command, where all of the required data entry occurs.  A company of approximately 100 individuals requires two full-time personnel to manage the automated system.  Lower-level stakeholders seem to have been neglected throughout the process.

Other errors that affect lower-level stakeholders include:

  • A non-intuitive interface requiring a full 40 hours of training before use
  • No back button after saving an event, requiring full navigation back through the home screen to edit the next event
  • Built on Microsoft Silverlight, a technology that forces the site to run in older versions of Internet Explorer and that has since been abandoned by its creator, Microsoft
  • Limited resources to address issues found through feedback (some recommendations are years old with no resolution)
  • Unable to upload documents en masse (feature is listed as an option and fails upon execution)
  • Unable to make adjustments to the personnel assigned to the unit, causing miscalculations of averages and aggregate data by including individuals no longer with the unit or misassigned
  • Exports to poorly designed formats
  • Exports UserID from database in Excel but hides the column with the UserID
  • Website susceptible to URL code injection
  • Higher echelons have more control over the data but are least familiar with it, making it easy for them to misalign personnel and accidentally delete crucial records with no easy method of restoration (no undo button), causing repeated efforts at lower levels to repair the mistake.
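The URL injection item above is the kind of flaw that’s cheap to guard against. As a minimal illustrative sketch (in Python, purely for illustration — DTMS was a Silverlight application, and the parameter names here are made up), the fix is to escape user-supplied values at the output boundary instead of echoing them raw:

```python
import html
from urllib.parse import parse_qs, quote


def render_greeting(query_string):
    """Echo a user-supplied query value into HTML safely.

    Echoing raw query data back into a page is the classic injection
    vector; html.escape() at the output boundary neutralizes it.
    """
    params = parse_qs(query_string)
    # parse_qs returns lists of values; take the first, or a default.
    name = params.get("name", ["trainee"])[0]
    return "<p>Welcome, {}</p>".format(html.escape(name))


def safe_link(base, segment):
    """Percent-encode a path segment so it can't smuggle extra URL syntax."""
    return base + "/" + quote(segment, safe="")


# A hostile query string comes back inert:
render_greeting("name=<script>alert(1)</script>")
```

The same principle applies regardless of the stack: treat every value that crosses a trust boundary as data, never as markup or URL syntax.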

A bad system is a good thing to learn from.  These issues are indicators of a project management team that failed to assess the project’s complexity and overlooked key stakeholders.  My role in the project was at the lowest command level, where we were told to utilize the new system only to watch it go down for several weeks due to implementation issues.  Since then we’ve made efforts to assert ourselves as stakeholders through the appropriate feedback mechanisms, only to have our perspective marginalized in the process.

It’s a bit easier to see how the DoD can’t account for $8,500,000,000,000 when it fails to implement good project management practices while updating its training management system.

When The Database Costs

Our organization focuses heavily on training and so our primary database is designed around training management.  The greatest resource for training is time.  When things are managed properly more time is available to conduct more training.  When things aren’t managed properly then less time is available for training, making the organization less effective when called upon to perform.

The training database is centrally controlled, closed source, and made by the lowest bidder, making it difficult for an end user or lower-level supervisor to conduct any analysis on its structure or make recommendations for improvement.

The database is accessed through a graphical user interface via a web browser, one designed around the workflow of the programming engineers rather than the end user.  The site was coded to modern HTML standards, but our workplace hasn’t authorized upgrading beyond Internet Explorer 9, making the site slow and its performance choppy.  And while the HTML may be modern, the other technology the site is built upon has been abandoned by today’s most popular browsers because it’s simply outdated.  Yes, streamlining the acquisitions process is important.

The site grants access to leadership six levels up and grants broader administrative rights to those in higher levels of leadership than to those who are most familiar with the data.  This has resulted in frequent data loss, as someone unfamiliar with the information will accidentally delete large amounts of data, requiring individualized manual re-entry through the aforementioned clunky GUI.

The whole experience has convinced several members of the organization that databases are not cost effective and that they should continue to rely heavily on spreadsheets and printed documents.  There is no good way to customize a summary of information using the database, whereas a spreadsheet with a few formulas and conditional formatting can quickly communicate areas of emphasis and allow managers to correct issues at the lowest level.
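That spreadsheet approach boils down to a simple threshold rule applied to each row. A rough Python sketch of the same “conditional formatting” logic (the unit names and the 80% threshold are made up for illustration):

```python
def flag_low_completion(rates, threshold=0.80):
    """Return unit names whose training-completion rate falls below the
    threshold -- the same rule a conditional-formatting highlight applies
    to a spreadsheet row."""
    return sorted(unit for unit, rate in rates.items() if rate < threshold)


# Hypothetical completion rates pulled from a training tracker.
rates = {"1st Platoon": 0.92, "2nd Platoon": 0.74, "3rd Platoon": 0.81}
needs_attention = flag_low_completion(rates)  # -> ["2nd Platoon"]
```

The point isn’t the code; it’s that a manager can express the rule themselves in a spreadsheet in seconds, while the database offers no equivalent customization.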

This database technology has cost the organization a great deal of time in retraining and has created unrewarding jobs of redundant data entry through a frustrating graphical user interface.

In a recent interview Nate Silver commented on how it took human skillsets a while to catch up to the capabilities of technology and really capitalize on its full value.  Just because someone can crunch the data doesn’t mean they can understand it.  This isn’t a problem for my organization, since we’re still learning principles of data entry and preservation.

Because of this learned incompetence we also don’t have to worry about something else Nate Silver talks about, big data and false correlations:

“I think one of the false promises that was made early on is that, well if you have a billion data points or a trillion data points, you’re going to find lots and lots of correlations through brute force. And you will, but the problem is that a high percentage of those, maybe the vast majority, are false correlations, are false positives. Where there could be significance, but you have so many lottery tickets when you can run an analysis on a trillion data points, that you’re going to have some one in million coincidences just by chance alone. If you bet all your money on them, you might wind up looking very foolish in the end.”
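Silver’s point is easy to demonstrate: correlate enough pairs of pure noise and some will look “significant” by chance alone. A small simulation (standard library only; the 0.4 cutoff and trial counts are arbitrary choices for illustration):

```python
import random
import statistics


def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)


random.seed(1)
n, trials = 30, 2000

# Correlate 2000 pairs of completely independent random series.
# Every |r| > 0.4 is one of Silver's "lottery tickets": apparent
# structure where, by construction, none exists.
hits = 0
for _ in range(trials):
    x = [random.random() for _ in range(n)]
    y = [random.random() for _ in range(n)]
    if abs(pearson(x, y)) > 0.4:
        hits += 1
```

Run it and a handful of “strong” correlations show up anyway, which is exactly the brute-force trap the quote describes.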

Thankfully I don’t work for an organization that will someday look foolish because of false conclusions.  I work for an organization that is struggling with data entry.

The role database technology should play within the organization is efficiently capturing, storing, and making accessible the data that will allow the organization to move through the Data-Information-Knowledge model.

In 1980 economist Thomas Sowell began his book Knowledge and Decisions with the statement that “Ideas are everywhere, but knowledge is rare.”  Throughout most of the book Sowell explains that decision makers can only make decisions based upon the information they have available to them at the time they make a decision.  In 1980 the ideas outweighed the information available to use them.  Today the same may still be true.  Knowledge may still be rare, but the data needed to create knowledge is certainly more prevalent now than at any point in human history.

Someday I hope to work in an organization where the data is compiled for human consumption allowing the Data-Information-Knowledge model to work effectively.