Thoughts From a Global Technology Leadership Forum

I recently had the privilege of attending and participating in a global technology leadership forum.  The forum consisted of technology investors, vendors, and thought leaders, and was an excellent event.  The tracks I focused on were VDI, Big Data, Data Center Infrastructure, Data Center Networks, Cloud and Collaboration.  The following are my notes from the event:


VDI:

There was a lot of discussion around VDI, and a track dedicated to it.  The overall feeling was that VDI has not lived up to its hype over the last few years; while it continues to grow market share, it never reaches the predicted numbers or sees the boom predicted for it.  For the most part the technical experts agreed on the following:

  • VDI has had several technical, cost, and image-related hang-ups that have held it back from mass-scale adoption
  • The technical challenges have for the most part been solved: storage techniques like caching, tiering, and SSD can address the IOPS contention and help to reduce costs.  Storage optimization products like Atlantis Computing also exist to lower cost per seat by reducing the storage required to obtain acceptable IOPS.
  • The cost model is getting better but is still not at a place where VDI is a no-brainer.  The consensus was that until a complete VDI solution can be rolled out for a cost per seat equal to or lower than a typical enterprise desktop/laptop, it will still be a tough decision.  Currently VDI is still a soft-cost ROI: it provides added features and benefits at a slightly higher cost.
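The caching and tiering point above can be sketched with some back-of-the-envelope math: a small flash tier in front of spinning disk changes the average service time dramatically once most reads hit cache.  The latency figures and hit rates below are illustrative, not measurements from any vendor's product.

```python
# Back-of-the-envelope model of how a flash cache tier changes effective
# storage latency and IOPS for a VDI workload. All numbers are illustrative.

def effective_latency_ms(hit_rate, ssd_ms=0.1, hdd_ms=8.0):
    """Average service time when cache hits go to SSD and misses to HDD."""
    return hit_rate * ssd_ms + (1.0 - hit_rate) * hdd_ms

def effective_iops(hit_rate, ssd_ms=0.1, hdd_ms=8.0):
    """Approximate single-stream IOPS: 1000 ms divided by average latency."""
    return 1000.0 / effective_latency_ms(hit_rate, ssd_ms, hdd_ms)

for rate in (0.0, 0.5, 0.9, 0.99):
    print(f"hit rate {rate:.0%}: "
          f"{effective_latency_ms(rate):.2f} ms avg, "
          f"~{effective_iops(rate):.0f} IOPS")
```

Even a 90% hit rate takes the average latency from 8 ms to under 1 ms, which is the basic economics behind using a modest amount of SSD to serve boot storms and login storms instead of buying spindles for peak IOPS.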

There was some disagreement on whether VDI is the right next step for the enterprise.  The split I saw was nearly 50/50, with half thinking it is the way forward and will be deployed at greater and greater scale, and the other half thinking it is one of many viable current solutions and may not be the right 3-5 year goal.  I’ve expressed my thoughts on this previously.

Lastly, we agreed that the key leaders in this space are still VMware and Citrix.  While each has pros and cons, it was believed that both solutions are close enough to be viable, and that VMware’s market share and muscle make it very possible for them to pull into a dominant lead.  Other players in this space were complete afterthoughts.

Big Data:

Let me start by saying I know nothing about big data.  I sat in these expert sessions to understand more about it, and they were quite interesting.  Big data sets are being built, stored, and analyzed.  Customer data, click traffic, etc. are being housed to gather all types of information and insight.  Hadoop clusters are being used for processing data, and cloud storage such as Amazon S3 is being utilized alongside on-premises solutions.  The main questions were in regard to where the data should be stored and where it should be processed, as well as the compliance issues that may arise with both.  Another interesting question was the ability to leave the public cloud if your startup grows big enough that a private cloud beats the cost of the public one.  For example, if you have a lot of data you can mail Amazon physical disks to get it into S3 faster than WAN speed, but to our knowledge they can’t/won’t mail your disks back if you want to leave.
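For readers (like me) new to the space, the processing model those Hadoop clusters apply at scale is MapReduce, and it can be sketched in a few lines: map each record to key/value pairs, shuffle by key, then reduce each group.  This in-process toy is just the model, not Hadoop itself, and the click-log records are made up.

```python
# Minimal in-process sketch of the MapReduce model Hadoop clusters apply
# at scale: map records to key/value pairs, shuffle by key, reduce groups.
from collections import defaultdict

def map_phase(records):
    for line in records:                 # e.g. lines of click-traffic logs
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    grouped = defaultdict(list)          # in Hadoop this is the network shuffle
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    return {key: sum(values) for key, values in grouped.items()}

logs = ["click home click", "click cart"]
print(reduce_phase(shuffle(map_phase(logs))))   # {'click': 3, 'home': 1, 'cart': 1}
```

The reason this shape matters for the storage question above is that the map and reduce steps run where the data lives, which is exactly why "where is it stored" and "where is it processed" kept coming up as one question rather than two.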

Data Center Infrastructure:

Overall there was agreement that very few data center infrastructure (defined here as compute, network, storage) conversations occur without talk of cloud.  Cloud is a consideration for IT leaders from the SMB to the large global enterprise.  That being said, while cloud may frame the discussion, the majority of current purchases are still focused on consolidation and virtualization, with some automation sprinkled in.  Private-cloud stacks from the major vendors also come into play, helping to accelerate the journey, but many are still not true private clouds.

Data Center Networks:

I moderated a session on flattening data center networks, currently referred to as building ‘fabrics.’  The majority of the large network players have announced or are shipping ‘fabric’ solutions.  These solutions build multiple active paths at Layer 2, alleviating the blocked links traditional Spanning-Tree requires.  This is necessary as we converge our data and ask more of our networks.  The panel agreed that these tools are necessary, but that standards are required to push this forward and avoid vendor lock-in.  As an industry we don’t want to downgrade our vendor independence to move to a fabric concept.  That being said, most agree that pre-standard proprietary deployments are acceptable as long as the vendor is committed to the standard and the hardware is intended to be standards compliant.
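The blocked-links point is easy to see with a little graph math: on any redundant topology, Spanning-Tree leaves only one loop-free tree forwarding and blocks everything else, while a fabric keeps all paths active.  The four-switch full mesh below is a hypothetical topology, and the BFS tree stands in for the single tree STP would compute.

```python
# Sketch of why 'fabric' designs matter: classic Spanning-Tree blocks every
# link not on one loop-free tree, while a fabric keeps all links forwarding.
from collections import deque

# Hypothetical four-switch full mesh: 6 links total.
links = {("A", "B"), ("A", "C"), ("A", "D"),
         ("B", "C"), ("B", "D"), ("C", "D")}

def spanning_tree(links, root="A"):
    """BFS tree from the root: a stand-in for the one tree STP leaves active."""
    adjacency = {}
    for u, v in links:
        adjacency.setdefault(u, set()).add(v)
        adjacency.setdefault(v, set()).add(u)
    tree, seen, queue = set(), {root}, deque([root])
    while queue:
        u = queue.popleft()
        for v in sorted(adjacency[u]):
            if v not in seen:
                seen.add(v)
                tree.add((u, v))
                queue.append(v)
    return tree

active = spanning_tree(links)
print(f"links in mesh: {len(links)}, forwarding under STP: {len(active)}, "
      f"blocked: {len(links) - len(active)}")   # 6 links, 3 forwarding, 3 blocked
```

Half the links (and half the paid-for bandwidth) sit idle under STP in this mesh, which is the capacity argument the fabric vendors lead with.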


Cloud:

One of the main conversations I had was in regard to PaaS.  While many agree that PaaS and SaaS are the end goals of public and private clouds, the PaaS market is not yet fully mature.  Compatibility, interoperability, and lock-in were major concerns overall for PaaS.  Additionally, while there are many PaaS leaders, the market is so immature that leadership could change at any time, making it hard to pick which horse to back.

Another big topic was open standards and open source: OpenStack, OpenFlow, and open source players like Red Hat.  With Red Hat’s impressive YoY growth they are tough to ignore, and there is a lot of push for open source solutions as we move to larger and larger cloud systems.  The feeling is that larger and more technically adept IT shops will look to these solutions first when building private clouds.


Collaboration:

Yet another subject I’m not an expert on but wanted to learn more about.  The first part of the discussion entailed deciding what we were actually discussing, i.e. ‘What is collaboration?’  This was needed because, depending on who you talk to, the term encompasses voice, video, IM, conferencing, messaging, social media, and more.  We settled into a focus on enterprise productivity tools, messaging, information repositories, etc.  The overall feeling was that there are more questions than answers in this space.  Great tools exist, but there are no clear leaders.  Integration between enterprise tools and public tools was also a topic, including the question of ensuring compliance.  One of the major discussions was building internal adoption and maintaining momentum.  The concern with a collaboration tool rollout is an initial boom of interest followed by a lull and the eventual death of the tool as users get bored with the novelty before finding any ‘stickiness.’


Passwords Are Doomed: You NEED Two-Factor Authentication

How many people use passwords of eight characters or fewer, with the first letter capitalized and numbers at the end?  People are predictable, and so are their passwords.  To make things worse, people are lazy and tend to use the same password for just about everything that requires one.  A study presented at the DEFCON hacker conference stated, “with $3,000 and 10 days, we can find your password.  If the dollar amount is increased, the time can be reduced further.”  This means that regardless of how clever you think your password is, it is eventually going to be crackable as computers get faster, using brute-force algorithms mixed with human probability.  Next year the same researchers may state, “with 30 dollars and 10 seconds, we can have your password.”  Time is against you.
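The "human probability" point is worth quantifying: a predictable pattern shrinks the search space by orders of magnitude compared to the full keyspace of the same length.  The guess rate below is a hypothetical number for illustration, not a figure from the DEFCON study.

```python
# Rough brute-force math behind the "passwords are doomed" claim.
# The guess rate is illustrative; real cracking rigs vary widely.

def seconds_to_exhaust(charset_size, length, guesses_per_second):
    """Worst-case time to try every password in the keyspace."""
    return charset_size ** length / guesses_per_second

RATE = 1e10  # hypothetical offline guesses per second

# Full 8-character keyspace over 95 printable ASCII characters...
full = seconds_to_exhaust(95, 8, RATE)
# ...versus the predictable pattern described above: one capital letter,
# five lowercase letters, then two digits.
predictable = 26 * 26**5 * 10**2 / RATE

print(f"full 8-char keyspace:        {full / 86400:.1f} days")
print(f"Capital + 5 lower + 2 digit: {predictable:.1f} seconds")
```

At the same guess rate, the predictable pattern falls in seconds while the full keyspace takes days, which is why attackers model human habits instead of brute-forcing blindly.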

Increasing password lengths and mandating character types helps combat this threat; however, humans naturally fall back on predictable practices as passwords become difficult to remember.  It’s better to separate authentication keys into different factors so attackers must compromise multiple targets to gain access.  This dramatically improves security but doesn’t make it bulletproof, as seen when RSA tokens were compromised by Chinese hackers.  Keys can be separated by leveraging something you know, something you have, and something you are.  The most common two-factor solutions combine something you know and something you have: a known password/PIN plus a token, CAC/PIV card, or digital certificate.  Biometrics is becoming more popular as the technology becomes affordable.

There are tons of vendors in the authentication market.  Axway and ActivIdentity focus on something you have, offering CAC/PIV card solutions.  These can be integrated with door readers to provide access control to buildings along with two-factor access to data.  RSA and Symantec focus on hardware- or software-based certificate/token solutions.  These can be physical key fobs or software on smartphones and laptops that generates a unique security code every 30 seconds.  Symantec acquired the leader in the cloud space, VeriSign, which offers recognizable images and challenge-and-response type solutions.  Symantec took the acquisition further by changing their company logo to match the VeriSign “check,” based on its reputation for cloud security.
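The rotating 30-second code those tokens generate is, in the common case, TOTP as standardized in RFC 6238: HMAC a time-based counter with a shared secret, then truncate the digest to a few digits.  A minimal stdlib-only sketch (using the RFC's published test secret and time, not any vendor's implementation):

```python
# Sketch of the TOTP scheme (RFC 6238) behind many "new code every 30
# seconds" tokens: HMAC a time counter with a shared secret, then truncate.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6):
    """Compute the current (or given-time) time-based one-time password."""
    now = for_time if for_time is not None else time.time()
    counter = int(now // step)                        # 30-second time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"   # shared between the token and the server
print(totp(secret, for_time=59))   # RFC 6238 test time -> "287082"
```

The server runs the same computation with the same shared secret, so a stolen password alone is useless without the device holding the secret, which is the whole point of the second factor.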




The consumer market is starting to offer two-factor options to its customers.  Cloud services such as Google and Facebook contain tons of personal information and now offer optional two-factor authentication.  It’s common practice for financial institutions to use combinations of challenge-and-response questions, known images, and downloadable certificates that tie machines to accounts.  The commercial trend is moving in the right direction; however, common practice for average users is still predictable passwords.  As many security experts have stated, security is only as strong as the weakest link.  Weak authentication will continue to be a target as hackers utilize advanced computing to overcome passwords.

More security concepts can be found at
