Identity & Access Management

2013 and 2014 were major years for IAM awareness in both government and industry.  The Snowden leaks taught government agencies the value of limiting individual access to vast troves of information.  In the private sector, Target's credit card breach cost the company 46% of its fourth-quarter profits and litigation from more than 140 lawsuits (Radichel, 2014).  Although Target's breach might have been stopped by any number of mitigation efforts, proper IAM would have limited the intruders' ability to spread from the billing system used by the HVAC company to the more sensitive parts of the network.

The attention these breaches received has increased focus on the topic at all levels.  Some parts of the industry, such as Staminus Security and Norse Corp, have responded with security theater (Gallagher, 2016; Fisher, 2016), while others have taken a thoughtful look at making sure only the right people have access to the right amount of information.

For local IT, IAM often takes the form of a Microsoft Active Directory server or some LDAP variant.  LDAP v3 is described in RFC 4511, released in 2006, and includes a number of key operations: bind, unbind, unsolicited notification, search, modify, add, delete, modify DN, compare, abandon, extended operation, intermediate response message, and StartTLS.  In general these operations are initiated over TCP or UDP port 389.
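All of these operations act on entries addressed by distinguished names (DNs), each an ordered list of relative distinguished names.  As a rough illustration, a naive Python sketch of splitting a DN into its parts (it deliberately ignores escaped commas, which the LDAP string representation permits):

```python
def parse_dn(dn: str) -> list[tuple[str, str]]:
    """Split an LDAP distinguished name into (attribute, value) RDN pairs.

    Naive sketch: does not handle commas escaped inside values.
    """
    rdns = []
    for part in dn.split(","):
        attr, _, value = part.strip().partition("=")
        rdns.append((attr, value))
    return rdns

parse_dn("cn=Alice,ou=Engineering,dc=example,dc=com")
# → [('cn', 'Alice'), ('ou', 'Engineering'), ('dc', 'example'), ('dc', 'com')]
```

A real directory client would hand such a DN to the bind or search operation rather than parse it by hand; the sketch only shows the hierarchical naming the protocol is built on.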

While RFC 4511 has served the industry well for building functioning authentication systems, the nearly full decade since its release has seen a great deal of growth and development in the field.  In December 2015 the VP of Technology for Advancer Corporation penned his IAM predictions for 2016, giving us an indication of how far the field has developed.  His seven predictions include:


  1. Cyber security has become the religion, equally for government and businesses.

  2. Cloud IAM to spread towards provisioning capabilities.

  3. Spreading of IDM systems on on-premise as well as cloud.

  4. Cloud will enable greater utilisation of IAM products by small enterprises.

  5. Safeguarding and securing super users through PAM.

  6. Managing of identity through secure user identity management and access governance will enhance.

  7. Businesses will stay agile by adding more layers of IAM into their IT infrastructure.

(Mittal, 2015)

In addition to SaaS, PaaS, and IaaS, companies including Centrify are now talking about Identity as a Service (IDaaS).

All of these technologies are extensions of the need to be authenticated within cyberspace.  For individual users, services such as LastPass step in to help manage their online identity across a myriad of websites.  SSH, Bitcoin, and Bitmessage all use cryptographic keys to verify sender and recipient identity during transmission.

For the average user this effort really hits home in the area of social media.  In 2011 Facebook began forcing HTTPS connections to reduce the man-in-the-middle attack vector (Stackoverflow, 2011).  Google also adopted HTTPS in 2011 to reduce snooping on user search queries (Google, 2011).  The robustness and popularity of social media led Gartner's research team to predict in 2013 that future customer identities would be based on social media (Gartner, Inc., 2013).  Today the spirit of that prediction holds true, as social media sites are integrated into sharing-economy sites such as Airbnb and educational sites such as Khan Academy.  Google's developer websites now include easy-to-follow guides for leveraging its identification services in emerging technology (Google, 2016).

In mobile computing, just as on traditional machines, identity management begins with authenticating on the device itself.  Fingerprint readers are now serious features on smartphones.  Although phones do have inherently insecure networking components (Anthony, 2013), they enable a second layer of IAM: two-factor authentication (2FA).  2FA on smartphones works because the phone itself is part of two separate networks.  The SMS messaging service, built on purely cellular technology, is in many ways a separate network from the phone's data connection.  Because of this, an attempted login over HTTPS can be verified with an SMS message.  Circle requires authentication before conducting bitcoin transactions, and major social networks now offer 2FA as part of their authentication services.
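SMS is one second factor; another common one on smartphones is the time-based one-time password (TOTP), which derives a short code from a shared secret rather than a network message.  A minimal sketch of the underlying RFC 4226 HOTP algorithm (the building block of TOTP) in Python:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password (the building block of TOTP)."""
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: counter 0 yields "755224"
print(hotp(b"12345678901234567890", 0))  # → 755224
```

For TOTP (RFC 6238) the counter is simply the current Unix time divided by a 30-second step, so the phone and the server can agree on the code with no round trip at all.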

In the cloud, IAM has become a must-have as cloud features have grown in popularity and potential.  Google Apps for Work accommodates several layers of sharing options for files hosted on Google Drive.  By default a file is accessible only to its author; when sharing is turned on, the default scope is the entire organization.  Additional options exist for public read-only, public edit, and organization read-only access.  Because the system is cloud based it can respond quickly to new features suggested by user feedback.  Google's products aren't the only ones with these features: similar access control and identification measures are implemented in Dropbox and ownCloud and are considered a standard feature set when developing similar tools.
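Those sharing levels amount to a small access-control check.  A hypothetical Python sketch (the scope names below are my own labels for the levels described above, not any vendor's API):

```python
from enum import Enum

class Scope(Enum):
    """Hypothetical labels for the document sharing levels described above."""
    PRIVATE = "author only (the default)"
    ORG_READ = "organization read-only"
    ORG_EDIT = "entire organization can edit"
    PUBLIC_READ = "public read-only"
    PUBLIC_EDIT = "public edit"

def can_read(scope: Scope, in_org: bool, is_author: bool) -> bool:
    if is_author:
        return True
    if scope in (Scope.PUBLIC_READ, Scope.PUBLIC_EDIT):
        return True
    return in_org and scope in (Scope.ORG_READ, Scope.ORG_EDIT)

def can_edit(scope: Scope, in_org: bool, is_author: bool) -> bool:
    if is_author:
        return True
    if scope is Scope.PUBLIC_EDIT:
        return True
    return in_org and scope is Scope.ORG_EDIT
```

The point of the sketch is that every request carries an identity (author, organization member, or anonymous), and the scope attached to the file decides what that identity may do.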

As we move more and more to the cloud and big data becomes more of a reality for businesses, IAM will remain a significant part of an organization's IT strategy.  In the business world, Sony's 2014 breach, which some attributed to an insider threat, is a critical example of how big data matched with poor IAM can cause serious problems.  While 2013 and 2014 were significant years for IAM awareness, the industry has since matured; only time will tell whether the pace of that maturity across the spectrum has kept up with the pace of innovation from malicious actors.





Anthony, S. (2013, November 13). The secret second operating system that could make every mobile phone insecure | ExtremeTech. Retrieved April 12, 2016, from

Fisher, C., & Jude, A. (2016, February 4). Hot Norse Potato | TechSNAP 252 | Jupiter Broadcasting. Retrieved April 12, 2016, from

Gallagher, S. (2016, March 11). After an easy breach, hackers leave “TIPS WHEN RUNNING A SECURITY COMPANY”. Retrieved April 12, 2016, from

Gartner, Inc. (2013, February 5). Gartner Says Half of New Retail Customer Identities Will Be Based on Social Network Identities by 2015. Retrieved April 12, 2016, from

Google. (2011, October 18). Making search more secure. Retrieved April 12, 2016, from

Google. (2016, April 12). Google Identity Platform | Google Developers. Retrieved April 12, 2016, from

Mittal, R. (2015, December 18). IAM Tech Trends to watch out for in 2016. Retrieved April 12, 2016, from

Radichel, T. (2014, August 5). Case Study: Critical Controls that Could Have Prevented Target Breach. Retrieved March 29, 2016, from

Stackoverflow. (2011, January 27). Force HTTPS on Facebook? Retrieved April 12, 2016, from

Information Portability and Control

Information is the ultimate currency of any organization.  Protecting that information so the right message is released to the right audience at the right time is the responsibility of each individual in that organization.  In some sectors, mishandling of information can lead to life-threatening security concerns.  In most cases, though, information mishandling leads to a significant cost in time and money.

Among the many measures businesses can adopt to mitigate the risk of information spillage, the first I would recommend is normalizing encryption for all communication.  One of the major ways people communicate online is email, and several email encryption solutions are available.  Most are offered by private companies and are proprietary.  When selecting a proprietary system, it's important to evaluate how that system will accommodate future technology.  Currently the U.S. Army's email encryption system requires a 32-bit version of Internet Explorer 9 in order to function, which reduces users' ability to leverage new technologies.

In many cases email is used to transfer files in addition to text.  Some conversations can be consolidated into a collaborative, cloud-based editing system that allows multiple live editors, such as Google Docs, Office 365, or ownCloud.  These solutions are more secure because they don't require transferring the information unencrypted from one server to another; users simply edit the documents over their browsers' secure (HTTPS) connections.

If the business does need to transfer large files securely across multiple workstations, BitTorrent Sync allows easy and secure file sharing without any additional cost in organizational infrastructure.  Shared folders can be synchronized across a vast network with specific controls on who can read, write, and access the files.  I've used BitTorrent Sync to transfer multi-gigabyte movie files and large photo libraries across continents.

Businesses have to address the balancing act of sharing information within the organization while still maintaining control of it and leveraging it for profit.  Handling this information is everyone's responsibility, and any system that gets implemented needs to be easy to use so it achieves the widest possible adoption.

Data Mining & The Cognitive Hierarchy

Data mining is somewhat of a misnomer: the term describes the discovery of patterns within a dataset rather than finding the actual data in a dataset.  "The types of patterns decision makers try to identify include associations, sequences, clustering, and trends" (Kendall, 2013).  One way to understand how this works is to look at how data plays a role in the cognitive hierarchy of an organization.

Although a concept pioneered prior to automation and primarily concerned with delivering knowledge to decision makers, the cognitive hierarchy process still has value in today’s big data discussions.  In the first layer data exists but isn’t processed until it moves to the next layer.  This initial processing can include basic database sorting and the application of metadata.

In the second layer information is analyzed, and it's this layer where data mining takes place.  Data mining provides the analysis that can be shared as knowledge among the stakeholders and decision makers who must make decisions.  The hierarchy concludes when a decision has been made and is transmitted to all the stakeholders.
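The "association" pattern Kendall mentions is a concrete example of what this second-layer analysis looks like.  A toy sketch in pure Python, counting item pairs that frequently co-occur in transaction baskets (a much-simplified cousin of the Apriori algorithm; the basket data is invented for illustration):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Return item pairs that co-occur in at least min_support baskets."""
    counts = Counter()
    for basket in transactions:
        # Count each unordered pair of distinct items once per basket.
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

baskets = [
    ["milk", "bread", "eggs"],
    ["milk", "bread"],
    ["bread", "eggs"],
]
print(frequent_pairs(baskets, 2))
# → {('bread', 'eggs'): 2, ('bread', 'milk'): 2}
```

The raw baskets are the first layer's data; the surviving pairs are the second layer's information, ready to be passed up the hierarchy as knowledge ("customers who buy bread tend to buy milk").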

Since this pyramid was designed prior to today’s automation it has generally fallen out of vogue.  I believe it remains a good reference point for organizations that focus heavily on delivering meaningful content to a decision maker.  

According to Kendall and Kendall, modern data mining emerged "from the desire to use a database for more selective targeting of customers."  The decision techniques used for presenting this information to customers have also become automated, and that automation necessitates further refinement.  In relation to the cognitive hierarchy, a customer is a decision maker required to pass judgment on the knowledge presented to them.  In the area of online advertising there are a lot of hits and misses in the decision techniques used to present ads to customers.  Let me share an example.

I listened to a podcast on my phone (local audio content downloaded without going through any analytical tools).  Because of the local nature of the content, no metadata was created associating the show's topic with my use, and I saw no ads based on that topic.  One of the topics discussed was web hosting with DigitalOcean.  A few days later I went to their site using Google Chrome and signed up.  After that I started seeing ads for DigitalOcean on other websites.

This is a hit because the system recognized I am someone interested in web hosting, but it's also a miss because it didn't identify that I had already purchased the service from the advertised company.  That company spent money on ads designed to acquire new customers that were instead displayed on an existing customer's screen.  This inefficiency translates into a lower ROI with the advertiser, and if it falls low enough it can cause the advertised company to take its business elsewhere.  If the advertising company's revenue drops enough, it will likely adjust the decision techniques built into its algorithms to ensure it delivers the most competitive product possible for its customers.
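The effect of wasted impressions on ROI can be sketched with hypothetical numbers (the rates and dollar values below are invented purely for illustration):

```python
def campaign_roi(spend: float, conversions: float, value_per_conversion: float) -> float:
    """Simple return on investment: (revenue - cost) / cost."""
    revenue = conversions * value_per_conversion
    return (revenue - spend) / spend

spend = 100.0          # 1,000 impressions at $0.10 each
impressions = 1000
convert_rate = 0.05    # assume 5% of genuine prospects convert
value = 30.0           # assume $30 revenue per conversion

# Every impression reaches a prospect vs. 20% wasted on existing customers,
# who by definition cannot convert again.
roi_clean = campaign_roi(spend, impressions * convert_rate, value)
roi_wasted = campaign_roi(spend, impressions * 0.8 * convert_rate, value)
print(round(roi_clean, 2), round(roi_wasted, 2))  # → 14.0 11.0
```

Under these made-up numbers, a fifth of the budget landing on existing customers cuts the campaign's return from 14x to 11x, which is exactly the pressure that pushes the advertiser to refine its targeting algorithms.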

The waste in this example is an obvious disadvantage of the decision techniques used in presenting the content.  That disadvantage is certainly offset by the advantages of the system as a whole.  The automated aspect of the system comes at a very low setup cost to both the advertising and advertised companies, and is built upon a system that is easy for both parties to use.

There is also the advantage of identifying the market of interested customers.  Although broadcast media content such as the NFL's Super Bowl reaches a large audience of general users, that large audience has less appeal for specialized products with a narrower potential customer base.  Having the space to advertise specialty products is an accomplishment brought about through effective data mining.

A final advantage of today's automated decision techniques is the speed of adjustment.  Although complex in nature, the system can be massively updated and improved by changing a few lines of code.  Advertising channels with higher overhead, such as printed fliers, can't respond as rapidly to change.

Although somewhat out of vogue, the cognitive hierarchy is still a valuable visual tool for understanding the processing and presentation of content relevant to decision makers, even in the fast-adjusting era of big data.