
Thoughts on Information Security

I cannot stress enough the importance of information security. Almost every day we hear stories about security breaches – hacker groups defacing websites for political purposes, countries stealing proprietary information from other countries and companies, organized crime stealing credit card information and selling it on the black market.

Cloud computing and mobile devices have exacerbated the problem.

The thing about security is that it is at odds with convenience. We want to get things done quickly, but security slows us down. For instance, we are required to enter hard-to-guess passwords to access our bank accounts online or our companies’ applications. Why not just let us in right away? Remembering passwords (and lots of them) and being required to change them every three months take time and effort.

But if we want ourselves and the companies we work for to be secure, we should give up a little convenience. There is no other way.

A lot of technical solutions and innovations have been devised to improve information security. But no amount of technical innovation can solve the weakest link in security – social engineering. Remember the “I Love You” virus several years ago? It was a virus that spread when you opened an email attachment with the subject line “I Love You.” Who wouldn’t want to open an email with that subject line?

User awareness is the key. Companies and individuals should at least invest in training on security and privacy.

The sad thing is that many companies and individuals do not take security very seriously until they become victims. True, we should not spend a disproportionate amount of time and money on security. The resources we spend on security should be proportional to the assets we are protecting. You should not buy a one-million-dollar vault to protect a $100K painting.

When I obtained my CISSP certification several years ago, I didn’t plan on specializing in information security. I have, however, incorporated good security practices into system and network design and implementation, virtualization, storage, and almost all aspects of IT. But with the tremendous need for IT security professionals these days, I might consider specializing in information security.

Book Review: The Big Switch – Rewiring the World from Edison to Google

The Big Switch: Rewiring the World from Edison to Google. Nicholas Carr. New York: W. W. Norton and Company, 2008. 278 pp.

The future of computing, the book argues, is utility computing. Information Technology (IT) will reside “in the cloud” in a centralized fashion, and will be controlled by a few service providers who have built massive data centers. Just like electricity, IT will be delivered as a service to home users and to small and big companies. The IT departments of these companies may become irrelevant. There will be no need for them because “individuals and business units will be able to control the processing of information directly.”

High bandwidth availability makes utility computing possible. Soon, companies will outsource all of their IT functions – from storage to applications to programming – to service providers. Google has started this trend with Google Apps. Similarly, Amazon has offered software and hardware as a service. For instance, if a company needs an application, all it has to do is tell one of these service providers and the application will be available in no time. The company doesn’t have to go through the hassle of procuring equipment, hiring programmers, and developing the application.

This next big thing has many names – cloud computing, utility computing, grid computing, and software/hardware as a service (SaaS) – but the book calls it the World Wide Computer.

The premise of the switch from internal IT to the World Wide Computer is that too many resources are wasted on IT – labor, hardware, software, redundant systems, and overbuilt IT assets. The book contends that IT costs too much for what it delivers. There is simply an excess of servers and computing capacity. Ultimately, it is not the technology but the economics that will prevail. The cloud will make efficient use of IT resources.

Because everything is wired, physical location will not matter anymore. The same is true of software licensing. The model will be much like electricity – the client pays for usage, not for the costly software licenses that have made companies like Microsoft very rich. The new model, the book argues, is very much like the Google Apps model. Users will be empowered when tapping the World Wide Computer – the possibilities are endless with its infinite information and computing power.

For people who have been following the computing revolution, Carr’s concept of utility computing is old news. IBM and other IT visionaries have been talking about utility computing for years. However, the book successfully articulates the concept by drawing a parallel between the evolution of electrification and the evolution of computing.

The history of electrification is well researched, from the first waterwheels to windmills to today’s centralized power generators. Similarly, the history of computing is well researched, from Hollerith’s tabulating machine to the IBM mainframe to personal computing, client-server computing, and web computing. Along the way, Carr weaves in the business and economic forces that shaped their current form. He likewise discusses the social impacts of these shifts – how they changed societies and consequently changed people’s lives for the better. He discusses at great length the economic and social impact of the World Wide Computer – how the world will become increasingly multi-polar instead of united, the weaknesses of free-flowing information, and the loss of privacy.

Much as I agree with Carr’s position on utility computing, I do not believe that everything will go to the “cloud.” In my opinion, the future will be hybrid computing. There is so much computing power in every personal computer, laptop, and mobile device that not utilizing it is a waste.

The IT departments of large corporations will not disappear. The book misses the point that for some companies the IT system is strategic, and they cannot simply outsource all of their IT functions. Financial companies, for instance, rely heavily on their IT systems – take IT away from the stock market and trading will halt. The point is that IT has varying degrees of strategic importance for each company, whereas electricity does not. Everybody needs electricity because it is a commodity and can easily be sourced elsewhere (for instance, from internal generators). IT cannot simply be commoditized – companies need specialized applications.

Another issue is data security and privacy. In the cloud, we do not know where the data is stored. Intellectual property and company knowledge are simply too important to be hosted somewhere where security and privacy laws are not well defined. Unless there is a global law on data security and privacy, companies will hesitate to put their precious information in the cloud.

Finally, there is the law of unintended consequences. We simply cannot have a complete picture of the future. It is ironic, for instance, that because of the current concern for the environment, companies and homes alike may start generating their own power using solar, wind, or other means, thus decentralizing electricity generation once again. The use of electrification as a metaphor for the World Wide Computer may not be accurate after all.

Backup Infrastructure

I have been designing, installing, and operating backup systems for the past several years. I have mostly implemented and managed Symantec NetBackup (formerly Veritas NetBackup) for larger infrastructures and Symantec Backup Exec for smaller ones.

These products have worked very well, although some features are not very robust. I am very impressed, for instance, with the NDMP implementation in NetBackup – backing up terabytes of NetApp data via NDMP works very well. However, I do not like the NetBackup admin user interface, since it is not very intuitive. Its bare metal restore (BMR) implementation is also a pain; some of the bugs took years to fix, maybe because not many companies use BMR.

Backup Exec works very well for small to medium systems. It has a very intuitive interface, it is relatively easy to set up, and it has very good troubleshooting tools. Lately, though, Symantec has been playing catch-up in its support for newer technologies such as VMware – it is so much easier to use Veeam to manage backup and restore of virtual machines. In addition, Backup Exec has been breaking lately: recent Microsoft patches have caused System State backups to hang.

But I think the biggest threat to these backup products is online backup providers. CrashPlan, for instance, was initially developed for desktop backup, but it will not take long before companies use it to back up their servers. Once these providers properly address security concerns, companies will be even more compelled to back up their data online. It is just cheaper and easier to back up online.

NetApp Storage Migration Lessons

One of my latest projects is to consolidate six old NetApp filers and migrate a total of 30 TB of data to a new NetApp filer cluster, a FAS 3240C. The project started several months ago and is almost complete; only one of the six filers is left to migrate.

I have done several storage migrations in the past, and there are always new lessons to learn in terms of the technology, migration strategy and processes, and the people involved in the project. Here are some of the lessons that I learned:

  1. As expected, innovation in computer technology moves fast, and storage is no exception. IT professionals need to keep pace or our skills become irrelevant. I learned storage virtualization, NetApp Flash Cache, and SnapMirror using SMTape, among many other new features.
  2. Migration strategy, planning, and preparation take more time than the actual migration itself. For instance, one filer took only an hour and a half to migrate. However, the preparations – SnapMirroring the volumes, re-creating NFS and CIFS shares, updating users’ login scripts, making changes in several applications, and other pre-work – were done several days before the actual migration. The migration itself is really just catching up with the latest changes in the files (i.e., a snapmirror update) and flipping the switch; a rough sketch of that cutover flow follows this list.
  3. As with many other big IT projects, people are always the challenging part. The key is to engage the stakeholders (business users, application owners, technical folks) early in the project. Communicate the changes that are happening and how their applications and access to their data will be affected. Give them time to understand and make changes to their applications. Tell them the benefits of the new technology and communicate the status of the project often.
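
To make the cutover in lesson 2 concrete, here is a minimal sketch of the SnapMirror baseline/update/break flow, assuming the filers run Data ONTAP 7-Mode and are reachable over SSH. The filer names, volume names, and admin account are hypothetical placeholders, and the surrounding pre-work – allowing SnapMirror access on the source, re-creating the NFS/CIFS shares, and updating login scripts – still happens outside the script.

```python
#!/usr/bin/env python3
"""Minimal sketch of a SnapMirror-based cutover, assuming Data ONTAP 7-Mode
CLI access over SSH. The filer names, volume names, and admin account are
hypothetical placeholders."""

import subprocess

SRC_FILER = "oldfiler1"   # hypothetical source filer being retired
DST_FILER = "newfiler1"   # hypothetical destination head of the new filer cluster
SRC_VOL = "vol_projects"  # hypothetical source volume
DST_VOL = "vol_projects"  # hypothetical destination volume
ADMIN = "root"            # hypothetical admin account


def ontap(filer: str, command: str) -> str:
    """Run one ONTAP CLI command on a filer over SSH and return its output."""
    result = subprocess.run(
        ["ssh", f"{ADMIN}@{filer}", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


# Days before the cutover: restrict the destination volume and run the
# baseline transfer while users keep working on the old filer.
# (The source filer must already allow SnapMirror access from the destination.)
ontap(DST_FILER, f"vol restrict {DST_VOL}")
ontap(DST_FILER, f"snapmirror initialize -S {SRC_FILER}:{SRC_VOL} {DST_VOL}")

# Cutover window: catch up with the latest file changes, then break the
# mirror so the destination volume becomes writable ("flipping the switch").
ontap(DST_FILER, f"snapmirror update -S {SRC_FILER}:{SRC_VOL} {DST_VOL}")
ontap(DST_FILER, f"snapmirror quiesce {DST_VOL}")
ontap(DST_FILER, f"snapmirror break {DST_VOL}")

# Sanity check before repointing NFS/CIFS clients at the new filer.
print(ontap(DST_FILER, "snapmirror status"))
```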