Category Archives: IT Management

IT Infrastructure Qualification and Compliance

One of the requirements of building and operating an IT infrastructure in a highly regulated industry (such as the pharmaceutical industry, which is regulated by the FDA) is to qualify, or validate, the servers, network, and storage as they are built. Once built, any change to the infrastructure should go through a change control procedure.

Both building the infrastructure and making changes to it should be verified and documented so that they can be easily managed and traced. These activities are really not that different from standard best practices for operating an IT infrastructure, or from the ITIL processes.

The FDA does not dictate how to perform IT infrastructure qualification or validation, as long as you have reasonable, documented procedures.

The problem is that some companies overdo their validation and change control processes. The common problems I’ve seen are: (1) too many signatures are required to make a change; (2) there is no automated workflow for the documentation – many still route paper documents; and (3) the people who perform the checks and balances sometimes do not understand the technology.

The result is that IT personnel get overwhelmed with paperwork and bureaucracy. This discourages them from making critical changes to the infrastructure, such as applying security patches on time. It also makes IT personnel reluctant to bring newer or leading-edge technologies into their infrastructure.

Fortunately, the International Society for Pharmaceutical Engineering (ISPE) has published a Good Automated Manufacturing Practice (GAMP) guidance on IT Infrastructure Control and Compliance. Companies can create their own IT infrastructure qualification program and procedures based on the GAMP guidance document. These should be simple yet comprehensive enough to cover all the bases. It is also important that the procedures be periodically reviewed and streamlined so that they stay optimized.

IT Infrastructure for Remote Offices

When designing the IT infrastructure (servers, storage, and network) of small remote offices, infrastructure architects at large enterprises often face the question: what is the best IT infrastructure solution for remote sites? A low-cost, simple, secure, and easy-to-support solution always comes to mind, but a positive end-user experience – in terms of network and application performance and user friendliness – should also be a top priority when building the infrastructure.

Most small sites just need access to enterprise applications and to file and print services. Network infrastructure definitely needs to be built – the site’s local area network (LAN), wireless access points, a wide area network (WAN) link to the enterprise data center, and Internet access. The bigger question, though, is: should servers and storage be installed on site?

There are technologies, such as WAN accelerators and “sync and share” applications, that make it possible to forgo installing servers and storage at remote sites without sacrificing end-user experience. For instance, Riverbed WAN accelerator products dramatically improve access to files and applications in the enterprise data center from remote sites. These products can even serve up remote datastores for VMware farms. “Sync and share” applications are Dropbox-like applications (such as EMC Syncplicity); enterprises can build a storage-as-a-service solution on their internal infrastructure, eliminating the need to install file servers or storage appliances at the remote sites.

The decision to “install servers” or “go serverless” at a remote site still depends on many factors. It should be made on a case-by-case basis rather than with a cookie-cutter solution. Some of the criteria to consider are the number of people at the site and its growth projection, the storage capacity requirement, the available WAN bandwidth, the presence or absence of local IT support, office politics, and country- or region-specific regulations requiring data to remain local. If these factors are weighed, a better solution can be designed for remote offices.
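To make the weighing of these criteria concrete, here is a rough Python sketch of such a checklist. It is only an illustration of the decision process described above; the thresholds, weights, and cutoff score are made-up values, not recommendations.

# Rough sketch: decide "install servers" vs. "go serverless" for a remote site.
# The thresholds, weights, and cutoff below are made-up values for illustration only.
def site_needs_local_servers(headcount, growth_pct, storage_tb,
                             wan_mbps, has_local_it, data_must_stay_local):
    if data_must_stay_local:      # local-data regulation overrides everything else
        return True
    score = 0
    if headcount > 100 or growth_pct > 20:
        score += 2                # large or fast-growing site
    if storage_tb > 5:
        score += 2                # heavy local storage requirement
    if wan_mbps < 10:
        score += 3                # thin WAN link hurts centralized services
    if has_local_it:
        score += 1                # on-site gear is feasible with local support
    return score >= 4

# Example: a 30-person site with modest growth, 1 TB of data, a 50 Mbps WAN link,
# no local IT, and no data residency rules -> go serverless.
print(site_needs_local_servers(30, 10, 1, 50, False, False))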

Big Data

There is a lot of hype around big data these days, promising the next big revolution in information technology, one that will change the way we do business. It purports to have a big impact on the economy, science, and society at large. In fact, big data is currently at the “peak of inflated expectations” on the Gartner technology hype cycle.

Big data “refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it.” It answers questions that are sometimes not so obvious.

Big data definitely has tremendous potential. Once the hype subsides, organizations that do not take advantage of its power will be left behind. In fact, big data is already being used by technology companies such as Google, Amazon, and Facebook, and IT vendors such as Oracle, EMC, and IBM have started offering big data solutions for enterprises.

There are three drivers making big data possible:

First, robust and cheap IT infrastructure – powerful server platforms that crunch the data, advanced storage systems that hold huge amounts of data, and ubiquitous networks (Wi-Fi, 4G, fiber, etc.).

Second, the explosion of data from mobile devices, social networks, web searches, sensors, and many other sources.

Lastly, the proliferation of powerful analytics and data mining tools suited to big data, such as Hadoop, MapReduce, and NoSQL databases, with many more yet to be created. These tools will only get better.
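To make the MapReduce idea concrete, here is a minimal, single-machine Python sketch of a word count – the classic MapReduce example. It only illustrates the map, shuffle, and reduce phases; real frameworks such as Hadoop distribute these phases across a cluster.

# Toy single-machine illustration of the MapReduce pattern: word count.
from collections import defaultdict

documents = ["big data is big", "data changes how we work"]

# Map phase: emit (word, 1) pairs from each document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the emitted values by key (the word).
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: sum the counts for each word.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)  # {'big': 2, 'data': 2, 'is': 1, ...}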

I recently read the book entitled “Big Data: A Revolution That Will Transform How We Live, Work, and Think” by Viktor Mayer-Schönberger and Kenneth Cukier.

The book is spot-on in its predictions. With big data, there will be yet another paradigm shift in how we understand the world. With big data, “what” becomes more important than “why.” Big data also means processing complete datasets rather than just samples, and accepting results that are less than perfectly accurate.

The book also talks about the dark side of big data, such as the loss of privacy, how big data predictions can be used to police and punish individuals, and how organizations may blindly defer to what the data says without understanding its limitations.

I highly recommend the book to anyone who wants to fully understand big data and its implications.

Network and Server Monitoring Using Open Source Tools

I am a fan of open source tools. The Internet as we know it today would not exist without the open source movement. We owe this to the countless architects and developers who have dedicated their time and effort to writing open source software.

Enterprise IT departments can also take advantage of open source software; numerous companies have been using it for years. One particular area where it shines is network and server monitoring.

There are a lot of open source network monitoring tools out there. Leading the pack are Nagios, Zabbix, and Cacti. My favorite tool, though, is OpenNMS. I particularly like it because it is very easy to set up and administer. It can automatically discover the nodes on your network, and very few tweaks were needed when I first set it up. It provides simple event handling and notification via email or pager. In addition, its web-based management interface is very easy to use.
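To illustrate the basic idea behind such tools (this is not how OpenNMS itself is implemented or configured), here is a minimal Python sketch of an up/down check with an email alert. The host names and the SMTP server are placeholders.

# Toy up/down monitor illustrating what tools like OpenNMS automate at scale.
# Host names and the SMTP server are placeholders, not real systems.
import socket
import smtplib
from email.message import EmailMessage

HOSTS = [("fileserver.example.com", 445), ("webserver.example.com", 80)]

def is_up(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def notify(host, port):
    """Send a simple email alert for a host that failed its check."""
    msg = EmailMessage()
    msg["Subject"] = f"ALERT: {host}:{port} is down"
    msg["From"] = "monitor@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content(f"TCP check to {host}:{port} failed.")
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

for host, port in HOSTS:
    if not is_up(host, port):
        notify(host, port)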

I have been using OpenNMS for several years now, and it has been rock solid. I definitely recommend OpenNMS to IT departments that do not want to pay a hefty price to monitor their networks and servers.

End User Experience on Enterprise IT

Enterprise IT departments have put a lot of effort into adopting BYOD (Bring Your Own Device), driven by the popularity of mobile phones and tablets and the cost savings they bring to companies. However, I believe equal focus should be given to enterprise applications to enhance the end-user experience. Numerous enterprise applications are still antiquated, difficult to use, and not even suitable for mobile devices.

One of the goals of enterprise IT is to provide an excellent user experience and thereby increase end-user productivity. If the hardware devices are state-of-the-art mobile phones and tablets but the apps are very hard to use, the purpose is defeated.

For instance, searching for information inside the enterprise is still very difficult. Information is scattered across different file servers and applications, and very few companies have a Google-like enterprise search capability. People are frustrated because it’s easy to search for just about anything on the Internet, yet very difficult to find simple information inside the enterprise.

Enterprise applications should be like consumer IT applications, such as those provided by innovative companies like Amazon, Google, and Facebook. These web-based or mobile-based enterprise apps should be user friendly and intuitive, and they should not require training. Google does not ask us to take training whenever it deploys a new consumer app.

Enterprise apps should also be secure, just like those provided by online banking sites. Data should be encrypted and users properly authenticated.

End users should have the same experience at work using enterprise applications as they have at home shopping, banking, and searching online.

IT Converged Infrastructure

Is converged infrastructure the future? Major technology companies are now offering integrated compute, storage, and network in a box. Leading the pack is the Vblock system by VCE. Vblock consists of hardware and software from Cisco, EMC, and VMware.

Similarly, server, storage, and network vendors are offering their own integrated systems. NetApp, a storage vendor, sells FlexPod, which combines NetApp storage systems, Cisco Unified Computing System servers, and Cisco Nexus fabric into a single, flexible architecture.

Cisco, a networking company, has been selling its x86 Unified Computing System for years and recently bought Whiptail, a high-performance storage company, to enhance its unified infrastructure offering. HP, a server company, offers the POD solution.

These converged infrastructure solutions are not only suited to small or medium-sized data centers; they are engineered for large-scale, high-performance, highly reliable data centers. In addition, security, automation, and monitoring are built into the package.

With these solutions, companies do not need to spend time and money architecting and integrating servers, storage, and networks. Most importantly, operations and vendor support are simplified: there is a single point of contact for support, and finger-pointing between vendors is minimized.

The Evolving Role of IT Professionals

I started my career in software development. I wrote code, performed systems analysis, and did software quality assurance. Then I switched to system and network administration, and later to infrastructure architecture. While the role of software developers may not change much (software still needs to be written), the roles of IT administrators, architects, analysts, and IT departments in general are changing. This is due to cheap hardware, smarter software and appliances, and the availability of the cloud.

I still remember when I would spend a lot of time troubleshooting a system. Today, thanks to redundant systems, off-the-shelf and online applications, and the use of appliances, troubleshooting time has been reduced to a minimum. When a component breaks, it is easy to replace.

IT companies now sell converged network, server, and storage systems in a box, which eliminates the need for elaborate architecture and implementation and simplifies IT operations.

With virtualization and the “cloud,” more and more applications and IT services (infrastructure as a service, software as a service, etc.) are becoming available online.

When it comes to IT, companies now have several choices – host their IT services externally in a public cloud, build IT systems in house, or use a combination of the two.

Thus, the future role of IT professionals will be more like that of a broker. When the business comes to them with a need, they should be able to deliver quickly and provide the best IT solution. They should be able to determine when to use the public cloud and when to use internal IT systems. The key is to understand the business. For instance, it may not make sense to put data in the cloud if you are concerned about security or if your company is regulated by the government. If your company is small, it may not make sense to build a costly IT infrastructure in house.

Successful IT professionals are not only technically savvy but also business savvy.

Security Done Right

During a business trip to Israel a couple of months ago, I was subjected to a thorough security check at the airport. I learned later that everybody goes through the same process. It was a little inconvenient, but in the end, I felt safe.

With all the advanced security technologies available, nothing beats the old way of conducting security – thorough checks on individuals. I also noticed the defense-in-depth strategy at the Israeli airport – the several layers of security people have to pass through to reach their destinations. No wonder some of the greatest IT security companies come from Israel (e.g., Check Point).

As an IT security professional (I am CISSP certified), I can totally relate to the security measures Israel has implemented, and companies need to learn from them. Not a day goes by without news of companies being hacked, shamed, and extorted by hackers around the world.

Sadly, some companies only take security seriously when it is too late – after their data has been stolen, their systems have been compromised, or their Twitter account has been taken over. It will be a never-ending battle with hackers, but it is a good idea to start securing your systems now.

Backing Up NetApp Filer on Backup Exec 2012

The popularity of deduplicated disk-based backup, coupled with snapshots and other technologies, may render tape backup obsolete. For instance, if you have a NetApp Filer, you can use snapshot technology for backup and SnapMirror technology for disaster recovery. However, there may be requirements – such as regulatory requirements to keep files for several years, or infrastructure limitations such as low bandwidth to a remote DR (disaster recovery) site that prevents nightly replication – where tape backup is still the best option.

The proper way to back up a NetApp Filer to tape with Backup Exec 2012 is via NDMP. You can back up the Filer over the network using remote NDMP. If you can attach a tape device directly to the NetApp Filer, that is even better, because the backup no longer traverses the network and jobs will run faster.

However, using NDMP requires a license on Backup Exec. The alternative, which avoids buying the NDMP license, is to back up the Filer via its CIFS shares. Configuring Backup Exec 2012 for CIFS shares can be a little tricky, though. These are the things you need to do to make it work:

1. Disable the NDMP service on the NetApp Filer by issuing the command “ndmpd off” at the command line.
2. Change the default NDMP port number on the Backup Exec 2012 server. The default port is 10000; you may use port 9000, for example. Do this by editing the “services” file located at C:\Windows\system32\drivers\etc and adding the line “ndmp 9000/tcp”, then reboot the server. (A quick verification sketch follows this list.)
3. Make sure you have at least one Remote Agent for Windows license installed on your Backup Exec server.
4. Make sure that “Enable selection of user shares” is checked under “Configuration and Settings -> Backup Exec Settings -> Network and Security”.
5. When defining the backup job, select “File Server” as the type of server to back up.
6. When entering the NetApp Filer name, use its IP address or fully qualified domain name (FQDN).
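As a quick sanity check after step 2, here is a small Python sketch that verifies the “ndmp 9000/tcp” entry exists in the services file and that the Filer’s CIFS port is reachable from the Backup Exec server. The Filer host name is a placeholder; the services file path is the standard Windows location mentioned above.

# Sanity check for the setup above (run on the Backup Exec server).
# The Filer host name is a placeholder; replace it with your Filer's FQDN or IP.
import socket

SERVICES_FILE = r"C:\Windows\system32\drivers\etc\services"
FILER = "netapp-filer.example.com"
CIFS_PORT = 445

# 1. Confirm the custom NDMP port entry from step 2 is present.
with open(SERVICES_FILE) as f:
    has_ndmp_entry = any(line.split()[:2] == ["ndmp", "9000/tcp"]
                         for line in f
                         if line.strip() and not line.startswith("#"))
print("ndmp 9000/tcp entry present:", has_ndmp_entry)

# 2. Confirm the Filer's CIFS (SMB) port is reachable from this server.
try:
    with socket.create_connection((FILER, CIFS_PORT), timeout=5):
        print("CIFS port reachable on", FILER)
except OSError as exc:
    print("Cannot reach CIFS port on", FILER, "-", exc)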

The backup status when backing up the NetApp Filer this way will always be “Completed with Exceptions,” since Backup Exec still looks for a remote agent on the client. This is fine, as long as all files are being backed up.

Teaching Kids to Program

Should we teach our kids computer programming? I believe we should, even if their future careers will not be in computers. Computer programming teaches kids logic, mathematics and computation, design, and creativity – skills that are necessary in any profession.

Many will argue that kids these days are very computer savvy; they can easily figure out how an app on a computer, tablet, or iPhone works. I totally agree. However, for the most part, they are consumers or users of the technology. Being a creator is totally different. Creating, or programming, an app is a skill that is learned and developed over years.

Recently, I took on the task of teaching my eleven-year-old daughter computer programming, since her school is not teaching programming – at least not yet. I believe that the earlier you teach kids computer programming, the better off they will be. It is not that I want my daughter to be a nerd or to take up a computer career. I just want her to learn a very valuable skill – one that will be very useful in her future. We all know that the future will be dominated by computer technology.

Teaching kids to program is easier than you think. There is a program called Scratch, created by MIT, designed to teach kids – or any beginner – to program. From their website: “Scratch is a programming language that makes it easy to create your own interactive stories, animations, games, music, and art — and share your creations on the web.”

I used the book Super Scratch Programming Adventure!: Learn to Program By Making Cool Games by The LEAD Project to teach my daughter Scratch. I was glad that she got totally engaged with Scratch. Up next: Python programming for kids.
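As a taste of what that next step might look like (this is a made-up first exercise, not one from the book), here is the kind of tiny Python program a beginner could start with – a simple number-guessing game.

# A tiny first Python exercise for a beginner: guess the secret number.
import random

secret = random.randint(1, 20)
guess = None

while guess != secret:
    guess = int(input("Guess a number between 1 and 20: "))
    if guess < secret:
        print("Too low, try again!")
    elif guess > secret:
        print("Too high, try again!")

print("You got it! The number was", secret)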