
Networking Lessons

I’m not talking about computer networking. I’m talking about networking with people at events (social events, seminars, conferences) to increase your contacts and build meaningful relationships. You never know; these people could turn out to be your future employer, your business partner, or simply a friend.

I’m not saying I’m an expert in networking. Far from it. However, these are the lessons I’ve learned from attending numerous networking events.

First and foremost, I make sure it is an event I really want to attend. I get invited to a lot of networking events since I belong to different clubs – Toastmasters clubs, Harvard alumni clubs, etc. In addition, I get invited to a lot of IT-related events such as security conferences, trade shows, and vendor seminars. I ask myself the following questions before I sign up:

1. Will it add value to me?
2. Will I make new / meaningful connections?
3. Is it worth my time and money?

Once I have decided to attend, I prepare the night before the event. I polish my elevator speech, make sure I have enough business cards, and, if I have access to the attendee list, plan which people I’d like to meet. I also prepare questions I’d like to ask. Some of the questions I use to break the ice are the following:

1. How do you know the host?
2. What do you do for fun?
3. Where are you from? What do you do?
4. Compliment anything – appearance, health, clothing (e.g., “Wow, that’s a nice… Where did you get it?”)

During the event, I make sure to talk to people and to be the first one to say hello. I admit this takes a lot of effort for me since I am an introvert, but if I don’t initiate the conversation, nobody will. I ask a lot of questions and offer help where I can. Remember, networking is a two-way street. It’s not only about what you can get, but also about what you can do to help the other person.

If the event has a speaker, I try to ask questions and participate in the sessions.

I also make sure that I meet at least three new people I can connect with. I usually ask to connect on LinkedIn, since it is the best way to keep in touch.

Finally, I try to have fun and enjoy the event.

Internal Web Analytics

There are a lot of tools out there that can analyze web traffic for your site. Leading the pack is Google Analytics. But what if you want statistics for your internal website and you don’t necessarily want to send that information to an external provider such as Google? Enter Piwik. Piwik is very much like Google Analytics but can be installed on your internal network. The best part is that it’s free.

Since Piwik is a downloadable tool, you need a machine running a web server and MySQL. You can install it on your existing web server or on a separate one. I installed it on a separate CentOS machine and found the installation very easy. In fact, you just unzip a file and put the files in a web directory; the rest of the installation is done via the browser. If a required component is missing on your server (in my case, the PDO extension), it will tell you how to install it. Pretty neat.

After installing the server, you just need to put a small piece of JavaScript code on the pages you want to track. That’s it. Piwik will start gathering statistics for your site.
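Once data is coming in, you can also pull the numbers out programmatically through Piwik’s HTTP Reporting API. Below is a minimal Python sketch of that idea; the hostname, site ID, and token are placeholder values for whatever your own installation uses, and your Piwik version may expose slightly different report fields.

import json
import urllib.parse
import urllib.request

# Placeholder values -- replace with your own Piwik host, site ID, and API token.
PIWIK_URL = "http://piwik.internal.example.com/index.php"

params = {
    "module": "API",
    "method": "VisitsSummary.get",   # overall visits/actions for a site
    "idSite": 1,
    "period": "day",
    "date": "yesterday",
    "format": "JSON",
    "token_auth": "anonymous",       # use a real API token unless anonymous viewing is enabled
}

with urllib.request.urlopen(PIWIK_URL + "?" + urllib.parse.urlencode(params)) as resp:
    summary = json.loads(resp.read().decode("utf-8"))

print(summary.get("nb_visits"), "visits,", summary.get("nb_actions"), "actions yesterday")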

I also evaluated Splunk and its companion app, the Splunk App for Web Intelligence, but I found that it is not ready for prime time. There are still bugs; no wonder it is still in beta. When I was evaluating it, it wasn’t even able to extract usable information from Apache logs.

I had been using AWStats to extract statistics for internal websites for years. It served me well, but it sometimes produced inaccurate results. The open source Piwik web analytics tool provides accurate statistics and is the best tool I’ve used so far.

Focus on Existing Clients

I’ve been working as a part-time consultant for small and start-up companies in Cambridge, MA. These clients ask me to design and build their IT infrastructure. Most of the time the infrastructure is built in-house, and sometimes it is put in the cloud; it largely depends on which architecture makes sense for the client. For instance, some clients generate huge amounts of data in-house, so it makes sense to build the storage infrastructure on their premises.

Once the infrastructure is built, though, most of the work shifts to operations mode. This mode does not require a huge amount of time, especially in small companies; you only get called when there are problems. Should you then look for new clients so you can generate more revenue? I believe it is easier to focus on existing customers and generate more work (and revenue) from them. In fact, if you focus too much on finding new clients, your relationships with existing ones erode, your service becomes stagnant, and in some cases you end up losing their business.

Here are three proven ways to generate more revenue by focusing on existing clients:

1. Provide timely responses. When something breaks, fix it right away. If you cannot do it in the next hour, let the client know when you can work on it and give an estimated completion time. Improve your customer service skills and communicate often.

2. Address unmet needs. There will always be unmet needs in the Information Technology space. For instance, the client may not know that regulations require data containing personal information about employees and customers (credit card numbers, Social Security numbers, etc.) to be encrypted. Offer to create a project for this unmet need.

3. Offer value-added services. For instance, offer a comprehensive Disaster Recovery Plan. Tell the client that a simple backup infrastructure is not enough for the business to continue operating after a major disaster.

It’s hard and expensive to find new clients. Your existing clients will be happier (and will pay you more money) if you focus on them.

Mt. Wachusett: Conquered

Yesterday, Sunday, July 15, 2012, my friend Ferdie and I cycled the hilly 60-mile road course of the Charles River Wheelmen’s “Climb to the Clouds” bike tour.

Together with approximately 850 other cyclists, we tackled the grueling 1-mile, 9% grade climb up Mt. Wachusett. As an amateur rider, I felt the pain in my legs and back during the climb, but it was well worth it. The feeling of satisfaction when we reached the top was incredible.

I’ve been riding in bike tours for the past couple of years, including the fun Five Borough Bike Ride in New York City and the Maine Lighthouse Ride in South Portland, Maine. But “Climb to the Clouds” is the longest and most difficult tour I’ve done so far.

My goal is to ride a century course (100 miles or more) in the next couple of years. I’m looking at the Pan-Mass Challenge or the Harpoon Brewery2Brewery 150-mile ride from Boston to Vermont as my next challenge. It will be a tough ride, but just like anything else in life, if you want to reach your goal, you have to work hard for it.

Some photos: Climb to the Clouds, Maine Lighthouse Ride, Bike New York

Security Strategy

Amidst the highly publicized security breaches, such as the LinkedIn password leak, hacktivists defacing high-profile websites, and online thieves stealing credit card information, one of the most under-reported types of breach is nation states or unknown groups stealing intellectual property from companies: building designs, secret manufacturing formulas, business processes, financial information, and so on. This could be the most damaging kind of breach in terms of its effect on the economy.

Companies often do not even know they are being hacked, or they are reluctant to report such breaches. And the sad truth is that many companies do not bother beefing up their security until they become victims.

In this day and age, every company should have a comprehensive security program to protect its assets. It starts with an excellent security strategy, a user awareness program (a lot of security breaches begin with social engineering), and a sound technical solution. Multi-layered security is always the best defense: a firewall that monitors traffic, blocks IP addresses that launch attacks, and limits the network points of entry; an IDS/IPS that detects attacks and raises alerts; a good Security Information and Event Management (SIEM) system; and a good patch management system that patches servers and applications as soon as vulnerabilities are identified, to name a few.

Cost is always the deciding factor in implementing technologies, so due diligence is needed in creating a cost analysis and a threat model. As with any security implementation, you should not buy a security solution that costs more than the system you are protecting.

Harvard Club of Worcester

On June 7, 2012, the Harvard-Radcliffe Club of Worcester held its 106th annual meeting and dinner at the Beechwood Hotel. The event was well attended and very successful. The keynote speaker, Frederick Eppinger, CEO of Hanover Insurance, gave a very interesting speech on current improvements to and future plans for the City of Worcester.

During the meeting, I was asked to serve as the secretary of the club, and I gladly accepted the role. In fact, I am very excited to serve alongside an excellent group of officers. I look forward to fun-filled and successful events.

I have been an active member of the club for the past couple of years – attending sporting events to cheer on the Harvard Crimson football, basketball, and hockey teams; serving dinner to kids at the Worcester Boys and Girls Club during the Harvard Community Day of Service; and many others. News and pictures of past events can be found here.

If you are a Harvard alum living in the Worcester or Central Massachusetts area and want to get involved or network with fellow alums, please join our club events or contact any of the officers.

Disaster Recovery using NetApp Protection Manager

In our effort to reduce the use of tape media for backup, we have relied on disks for our backup and disaster recovery solution. Disks are getting cheaper, and de-duplication technology keeps improving. We still use tapes for archiving purposes.

One very useful tool for managing our backup and disaster recovery infrastructure is NetApp Protection Manager. It has replaced the separate management of local snapshots, SnapMirror to the disaster recovery (DR) site, and SnapVault. In fact, it doesn’t use those terms anymore. Instead of “snapshot,” it uses “backup.” Instead of “snapmirror,” it uses the phrase “backup to disaster recovery secondary.” Instead of “snapvault,” it uses “DR backup” or “secondary backup.”

NetApp Protection Manager is policy-based (e.g., back up primary data every day at 6 PM and retain backups for 12 weeks; back up primary data to the DR site every day at 12 AM; back up the secondary data every day at 8 AM and retain it for 1 year). As an administrator, you do not have to deal with the nitty-gritty technical details of snapshots, SnapMirror, and SnapVault.

There is a learning curve in understanding and using Protection Manager. I have been managing NetApp storage for several years and am more familiar with snapshots, SnapMirror, and SnapVault, but as soon as I understood the philosophy behind the tool, it became easier to use. NetApp is positioning it for the cloud, and the tool also has dashboards intended for managers and executives.

VMware Datastore via NFS

One of the objectives of our recently concluded massive storage upgrade project was to move our VMware datastores from iSCSI to NFS. I had been hearing about the advantages of using NFS versus block-level storage (i.e., iSCSI or Fibre Channel), and true enough, NFS did not disappoint.

We had been using iSCSI on NetApp as a VMware datastore for a long time, and it ran pretty well. But when we performed maintenance on the NetApp storage, the virtual machines were oftentimes affected. In addition, restore procedures could be a pain.

While Fibre Channel (FC) is still the standard storage protocol for most VMware implementations because it is a proven technology, in my experience these are the advantages of using NFS over iSCSI or FC:

1. Robust, as long as you follow the best-practice guidelines. For instance, separate the NFS network from the general-use network. VMware and NetApp have released white papers discussing NFS datastore best practices. In our environment, we performed several failovers on the NetApp storage to upgrade the Data ONTAP version, and the virtual machines were never affected.

2. Easier to configure, both on the VMware side and the NetApp side (see the sketch after this list).

3. Easier to back up, via NDMP on the NetApp side.

4. Easier to restore VMDK files using the snapshots on the NetApp side, since there is no need to mount LUNs.

5. VMware and NetApp have built great tools for seamless maintenance and operations.
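As an illustration of point 2, the same mount can even be scripted against the vSphere API. The sketch below uses the pyVmomi Python library, which is not what we used for this project (we configured the datastore through the vSphere client); the hostnames, export path, and credentials are placeholders, and the exact connection arguments may vary by pyVmomi version.

import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Placeholder hosts and credentials -- replace with your own environment.
si = SmartConnect(
    host="vcenter.example.com",
    user="administrator",
    pwd="secret",
    sslContext=ssl._create_unverified_context(),  # lab use only; verify certificates in production
)
try:
    content = si.RetrieveContent()
    # Find the ESX host that should mount the NFS export.
    esx_host = content.searchIndex.FindByDnsName(dnsName="esx01.example.com", vmSearch=False)

    # Describe the NFS export on the NetApp filer and the datastore name ESX will use.
    spec = vim.host.NasVolume.Specification(
        remoteHost="netapp01.example.com",
        remotePath="/vol/vmware_nfs",
        localPath="nfs_datastore01",
        accessMode="readWrite",
    )
    esx_host.configManager.datastoreSystem.CreateNasDatastore(spec)
finally:
    Disconnect(si)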

Thoughts on Information Security

I cannot stress enough the importance of information security. Almost every day we hear stories about security breaches – hacker groups defacing websites for political purposes, countries stealing proprietary information from other countries and companies, organized crime stealing credit card information and selling it on the black market.

Cloud computing and mobile devices have exacerbated the problem.

The thing with security is that it is at odds with convenience. We want to get things done quickly, but security slows us down. For instance, we are required to enter hard-to-guess passwords to access our bank accounts online or our companies’ applications. Why not just let us in right away? Remembering passwords (and lots of them) and being required to change them every three months take time and effort.

But if we want ourselves and the companies we work for to be secure, we have to give up a little convenience. There is no other way.

A lot of technical solutions and innovations have been devised to improve information security, but no amount of technical innovation can fix the weakest link in security: social engineering. Remember the “I Love You” virus several years ago? It spread when you opened an email with the subject line “I Love You.” Who wouldn’t want to open an email with that subject line?

User awareness is the key. Companies and individuals should at least invest in training on security and privacy.

The sad thing is that many companies and individuals do not take security seriously until they become victims. True, we should not spend a disproportionate amount of time and money on security; the resources we spend should be proportional to the assets we are protecting. You should not buy a $1 million vault to protect a $100K painting.

When I obtained my CISSP certification several years ago, I didn’t plan on specializing in information security. I have, however, incorporated good security practices into system and network design and implementation, virtualization, storage, and almost every other aspect of IT. But with the tremendous need for IT security professionals these days, I might consider specializing in information security after all.

Book Review: The Big Switch – Rewiring the World from Edison to Google

The Big Switch: Rewiring the World from Edison to Google. Nicholas Carr. New York: W. W. Norton and Company, 2008. 278 pp.

The future of computing, the book argues, is utility computing. Information Technology (IT) will reside “in the cloud” in a centralized fashion, and will be controlled by a few service providers who have built massive data centers. Just like electricity, IT will be delivered as a service to home users and to small and big companies. The IT departments of these companies may become irrelevant. There will be no need for them because “individuals and business units will be able to control the processing of information directly.”

High bandwidth availability makes utility computing possible. Soon, companies will outsource all of their IT functions, from storage to applications to programming, to service providers. Google has started this trend with its Google Apps, and Amazon similarly offers software and hardware as a service. If a company needs an application, all it has to do is ask one of these service providers and the application will be available in no time, without the hassle of procuring equipment, hiring programmers, and developing the application.

This next big thing has many names – cloud computing, utility computing, grid computing, software/hardware as a service (SaaS) – but the book calls it the World Wide Computer.

The premise of the switch from internal IT to the World Wide Computer is that too many resources are wasted on IT – labor, hardware, software, redundant systems, and overbuilt IT assets. The book contends that IT costs too much for what it delivers; there is simply an excess of servers and computing capacity. Ultimately, it is not the technology but the economics that will prevail: the cloud makes more efficient use of IT resources.

Because everything is wired, physical location will not matter anymore. The same is true of software licensing. The model will be much like electricity: the client pays for usage, not for the costly software licenses that have made companies like Microsoft very rich. The new model, the book argues, is very much like the Google Apps model. Users will be empowered when tapping the World Wide Computer; the possibilities are endless with its nearly infinite information and computing power.

For people who have been following the computing revolution, Carr’s concept of utility computing is old news. IBM and other IT visionaries have been talking about utility computing for years. However, his book successfully articulates the concept by drawing a parallel between the evolution of electrification and the evolution of computing.

The history of electrification is well researched, from the first waterwheels and windmills to today’s centralized power generators. The history of computing is equally well researched, from Hollerith’s machine to the IBM mainframe, personal computing, client-server computing, and web computing. Along the way, Carr weaves in the business and economic forces that shaped their current forms, and he describes their social impact and how they changed societies and people’s lives for the better. He discusses at great length the economic and social impact of the World Wide Computer: how the world will become increasingly multi-polar instead of united, the weaknesses of free-flowing information, and the loss of privacy.

As much as I agree with Carr’s position on utility computing, I do not believe that everything will go to the “cloud”. In my opinion, the future will be hybrid computing. There is so much computing power in every personal computer, laptop, and mobile device that not utilizing it would be a waste.

The IT departments of large corporations will not disappear. The book misses the point that for some companies the IT system is strategic, and they cannot simply outsource all of their IT functions. Financial companies, for instance, rely heavily on their IT systems; take IT away from the stock market and trading halts. The point is that IT has varying degrees of importance for each company, whereas electricity does not: everybody needs it, it is a commodity, and it can easily be obtained from alternative sources (such as internal generators). IT cannot simply be commoditized, because companies need specialized applications.

Another issue is data security and privacy. In the cloud, we do not know where the data is stored. Intellectual property and company knowledge are simply too important to be hosted somewhere where security and privacy laws are not well defined. Unless there is a global law on data security and privacy, companies will hesitate to put their precious information in the cloud.

Finally, there is the law of unintended consequences: we simply cannot have a complete picture of the future. It is ironic, for instance, that because of current concern for the environment, companies and homes alike may start generating their own power using solar, wind, or other means, decentralizing electricity generation once again. Electrification as a metaphor for the World Wide Computer may not be so apt after all.