
Data At Rest Encryption

When the Internet was invented several decades ago, security was not on the minds of the pioneers. TCP/IP, the protocol used to send data from one point to the next, was inherently insecure: data was sent over the wire in clear text. Today, advances in encryption technologies have enabled data to be secured while in transit. When you shop at reputable websites, for instance, you can be sure that the credit card number you send over the Internet is encrypted (you will see https in the URL instead of http). Most web applications now (such as Gmail, Facebook, etc.) are encrypted.

However, most of this data, when stored on servers (data at rest), is still not encrypted. That’s why hackers are still able to get hold of precious data such as personally identifiable information (PII) – credit card numbers, Social Security numbers, etc. – as well as trade secrets and other proprietary company information. There are many ways to secure data at rest without encrypting it (better authentication, better physical security, firewalls, secured applications, stronger deterrents against social engineering attacks, etc.), but encrypting data at rest adds another layer of security that keeps data unreadable when hackers do get hold of it.

The demand for encrypting data at rest is growing, especially now that more data is being moved to the cloud. Enterprise data centers are also being required to encrypt data on their storage systems, whether for business or compliance reasons.

Luckily, IT storage companies such as EMC, NetApp, and many others now offer encryption for data at rest on their appliances. However, encrypting data is still expensive. Encrypting and decrypting data require a lot of processing power, and adding encryption to the process may slow down access to the data. Better key management systems are also needed. For instance, when using the cloud for storage, data owners (as opposed to service providers) should solely possess the keys and should be able to manage them easily.
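
As a minimal sketch of what file-level encryption at rest can look like (the file names and key file below are made up for illustration; storage appliances implement this transparently and at much larger scale), a sensitive file can be encrypted with a symmetric key using OpenSSL, with the key file held only by the data owner:

# encrypt a file with AES-256 using a key file that only the data owner holds
$ openssl enc -aes-256-cbc -salt -in customers.csv -out customers.csv.enc -pass file:./owner.key

# decrypt it later with the same key file
$ openssl enc -d -aes-256-cbc -in customers.csv.enc -out customers.csv -pass file:./owner.key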

The Internet will be more secure if data is encrypted not only during transit but also during storage.

IT Infrastructure for Remote Offices

When designing the IT infrastructure (servers, storage, and network) of small remote offices, infrastructure architects at large enterprises often face the question: what is the best IT infrastructure solution for remote sites? A low-cost, simple, secure, and easy-to-support solution always comes to mind, but a positive end-user experience (network and application performance, user friendliness) should also be among the top priorities when building the infrastructure.

Most small sites just need access to enterprise applications and to file and print services. Network infrastructure definitely needs to be built: the site’s local area network (LAN), wireless access points, a wide area network (WAN) connection to the enterprise data center, and access to the Internet. The bigger question, though, is: should servers and storage be installed at the site?

There are a number of technologies, such as WAN accelerators and “sync and share” applications, that make it possible to forgo installing servers and storage at remote sites without sacrificing a positive end-user experience. For instance, Riverbed WAN accelerator products tremendously improve access from remote sites to files and applications in the enterprise data center. These products can even serve up remote datastores for VMware farms. “Sync and share” applications are Dropbox-like applications (such as EMC Syncplicity); enterprises can build a storage-as-a-service solution on their internal infrastructure, which eliminates the need to install file servers or storage appliances at the remote sites.

The decision to “install servers” or “go serverless” at remote sites still depends on many factors. It should be made on a case-by-case basis rather than with a cookie-cutter solution. Some of the criteria to consider are: the number of people at the site and the growth projection, the storage size requirement, available WAN bandwidth, the presence or absence of local IT support, office politics, and country- or region-specific regulations requiring data to remain local. If these issues are factored in, a better solution can be designed for remote offices.

Big Data

There is a lot of hype around big data these days, promising the next big revolution in information technology, one that will change the way we do business. It purports to have a big impact on the economy, science, and society at large. In fact, big data is currently at the “peak of inflated expectations” on the Gartner technology hype cycle.

Big data “refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it.” It answers questions that are sometimes not so obvious.

Big data definitely has tremendous potential. After the hype has subsided, organizations that do not take advantage of its power will be left behind. In fact, big data is already being used by technology companies such as Google, Amazon, and Facebook, and IT vendors such as Oracle, EMC, and IBM have started offering big data solutions for companies and enterprises.

There are three drivers making big data possible:

First, a robust and cheap IT infrastructure – powerful server platforms that crunch data, advanced storage systems that store huge amounts of data, and ubiquitous networks (Wi-Fi, 4G, fiber, etc.).

Second, the explosion of data from mobile devices, social networks, web searches, sensors, and many other sources.

Lastly, the proliferation of powerful analytics and data mining tools suited for big data, such as Hadoop, MapReduce, NoSQL databases, and many other tools yet to be created. These tools will only get better and better.
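
To make the Hadoop/MapReduce item a bit more concrete, here is roughly what running the classic word-count MapReduce job looks like from the command line (this assumes a working Hadoop installation; the jar path and HDFS directories below are illustrative and vary by distribution):

# copy input data into HDFS, run the bundled word-count example, then read the result
$ hadoop fs -put ./weblogs /user/demo/input
$ hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /user/demo/input /user/demo/output
$ hadoop fs -cat /user/demo/output/part-r-00000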

I recently read the book entitled “Big Data: A Revolution That Will Transform How We Live, Work, and Think” by Viktor Mayer-Schönberger and Kenneth Cukier.

The book is spot on in its predictions. With big data, there will be yet another paradigm shift in how we understand the world. With big data, “what” is more important than “why.” Big data also means processing complete data sets, not just samples of the data. It also means accepting results that are less than perfectly accurate.

The book also talks about the dark side of big data, such as the loss of privacy, how big data predictions can be used to police and punish individuals, and how organizations may blindly defer to what the data says without understanding its limitations.

I highly recommend the book to anyone who wants to fully understand big data and its implications.

Toastmasters Is Also About Leadership

Many people join a Toastmasters club to improve their communication skills. But Toastmasters is not only about communication; it’s also about leadership. There is a leadership program that members can take advantage of to improve their leadership skills. In fact, before a member can become a Distinguished Toastmaster – the highest Toastmasters educational award – one needs to complete both the leadership and communication tracks.

It makes sense that communications and leadership skills go hand in hand. Great communicators are great leaders, and great leaders are great communicators. Many areas of our society require leaders. People just need to step up and lead.

In Toastmasters, there are many opportunities to lead at the club, district, and international levels, thus improving our leadership skills. When I became a club president a year ago, I learned so many things, including organizing events, motivating people, and managing the club. Now that I am an area governor, I face a new set of challenges, and thus more opportunities to learn and lead.

Network and Server Monitoring Using Open Source Tools

I am a fan of open source tools. The Internet as we know it today would not exist if not for the open source movement. We owe this to the countless architects and developers who dedicated their time and effort to writing open source software.

Enterprise IT departments can also take advantage of open source software; numerous companies have been using it for years. One particular area where it can be used is network and server monitoring.

There are a lot of open source network monitoring tools out there. Leading the pack are Nagios, Zabbix, and Cacti. My favorite tool, though, is OpenNMS. I particularly like it because it is very easy to set up and administer. It can automatically discover the nodes on your network, and it required very few tweaks when I first set it up. It provides simple event management and notification via email or pager. In addition, its web-based management interface is very easy to use.
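
As a rough sketch of how that automatic discovery is configured (the subnet below is made up, and the exact attributes should be checked against the discovery-configuration.xml shipped with your OpenNMS version), discovery is driven by an XML file under the OpenNMS etc directory:

<!-- $OPENNMS_HOME/etc/discovery-configuration.xml (illustrative values) -->
<discovery-configuration threads="1" packets-per-second="1"
    initial-sleep-time="30000" restart-sleep-time="86400000"
    retries="1" timeout="2000">
  <!-- ping-sweep this range and add responding nodes automatically -->
  <include-range>
    <begin>192.168.1.1</begin>
    <end>192.168.1.254</end>
  </include-range>
</discovery-configuration>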

I have been using OpenNMS for several years now and it has been rock solid. I definitely recommend OpenNMS to IT departments that do not want to pay a hefty price to monitor their networks and servers.

End User Experience on Enterprise IT

Enterprise IT departments have put a lot of focus on adopting BYOD (Bring Your Own Device) because of the popularity of mobile phones and tablets and the cost savings they bring to companies. However, I believe equal focus should be given to enterprise applications to enhance the end-user experience. Numerous enterprise applications are still antiquated, difficult to use, and not even suitable for mobile devices.

One of the goals of enterprise IT is to provide an excellent user experience, thus increasing end-user productivity. If the hardware devices are state-of-the-art mobile phones and tablets but the apps are very hard to use, then the purpose is defeated.

For instance, searching for information inside the enterprise is still very difficult. Information is scattered across different file servers and applications. Very few companies have Google-like enterprise search capability. People are frustrated because it’s easier to search just about anything on the Internet, but it’s very difficult to find simple information inside the enterprise.

Enterprise applications should be like consumer IT applications, such as those provided by innovative companies like Amazon, Google, and Facebook. These web-based or mobile-based enterprise apps should be very user friendly and intuitive. In addition, training should not be required to use them; Google does not ask us to take training whenever it deploys a new consumer app.

Enterprise apps should also be secure, just like those provided by online banking sites. Data should be encrypted and users properly authenticated.

End users should have the same user experience when at home doing online shopping, banking, and searching, and when at work using enterprise applications.

Migrating Files to EMC VNX

There are several methodologies for migrating files to EMC VNX. One method I used recently to migrate Windows (CIFS) files was to copy files from the source CIFS server to the target VNX server using the emcopy migration suite from EMC. EMC provides these free tools – emcopy, lgdup, and sharedup – to accomplish the migration task. There are several steps you need to follow for a successful migration. In general, I used the following procedure:

1. Create the necessary VDM (Virtual Data Mover), File System, and CIFS server on the target VNX machine.
2. Create the CIFS share. Copy the share permissions (ACLs) and the NTFS root folder ACLs from the old share to the new share. You can also use the sharedup.exe utility for this.
3. Use the lgdup.exe utility to copy local groups from the source CIFS server to the target CIFS server.
4. Run emcopy.exe to perform the baseline copy (see the sample command after this list).
5. Create an emcopy script to sync the files every night. This will tremendously cut the time needed to update the files on the final day of the migration.
6. Analyze the emcopy logs to make sure files are being copied successfully. You may also spot-check the ACLs and/or run tools to compare files and directories between the source and target.
7. On the day of the cutover:
a. Disconnect users from the source CIFS server and make the file system read-only.
b. Run the final emcopy script.
c. Follow EMC156835 to rename the CIFS server so that the new CIFS server takes the name of the old CIFS server. This procedure entails unjoining the source and target CIFS servers from Active Directory (AD), renaming the NetBIOS name on the new CIFS server, rejoining the CIFS server to AD, and so on. Update the DNS record too if necessary.
8. Check the new CIFS shares and make sure the users are able to read/write on the share.
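
For reference, the baseline copy in step 4 looks roughly like the command below. The server names, share names, and log path are made up, and the exact switches should be verified against the emcopy help for the version you download, since options differ between releases:

c:\> emcopy.exe \\oldcifsserver\share1 \\newcifsserver\share1 /o /s /c /r:1 /w:1 /log:c:\migration\share1_baseline.log

The nightly sync script in step 5 can be built around the same command; check the emcopy documentation for the switches that control incremental and delete behavior.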

To migrate UNIX/Linux files, use the rsync utility to copy files between the source server and the target VNX.
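
A minimal example, assuming the source export and the new VNX file system are both NFS-mounted on a Linux host (the mount points below are illustrative):

# preserve permissions, ownership, timestamps, and hard links; -v shows each file copied
$ rsync -avH --numeric-ids /mnt/old_nfs_export/ /mnt/vnx_new_fs/

Rerunning the same rsync command nightly keeps the target in sync and, like the emcopy approach above, shortens the final cutover window.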

Moving a Qtree Snapmirror Source in NetApp Protection Manager

A couple of weeks ago, one of the volumes in our NetApp filer almost ran out of space. I could not expand the volume since its aggregate was also low on space, so I had to move the data to a different volume in an aggregate with plenty of space on the same filer. The problem was that this volume contained qtrees that are snapmirrored to our disaster recovery (DR) site and are managed by NetApp Protection Manager. How do I move the qtree SnapMirror sources without re-baselining the SnapMirror relationships in Protection Manager? Unfortunately, there is no way to do this using Protection Manager, and re-baselining was not an option – the volume holds terabytes of data that could take a couple of weeks to copy.

Like any sane IT professional, I googled how to do this. I did not find a straightforward solution, but I found bits and pieces of information, which I consolidated into the steps below. Generally, the steps below are a combination of snapmirror CLI commands and Protection Manager configuration tasks.

1. On the CLI of the original source filer, copy the original qtree to a new qtree on a new volume by using the following command:

sourcefiler> snapmirror initialize -S sourcefiler:/vol/oldsourcevol/qtree sourcefiler:/vol/newsourcevol/qtree

This took some time, and I also updated snapmirror.conf so that snapmirror updates daily.
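
The snapmirror.conf entry for that daily update might look like the line below (the 1:00 AM schedule is just an example). The fields are source, destination, arguments (- for defaults), and a cron-style schedule of minute, hour, day of month, and day of week:

sourcefiler:/vol/oldsourcevol/qtree sourcefiler:/vol/newsourcevol/qtree - 0 1 * *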

2. On the day of the cutover, perform a final snapmirror update on the new volume. Before doing this, make sure that nobody is accessing the data by removing the share.

sourcefiler> snapmirror update sourcefiler:/vol/newsourcevol/qtree

3. Log in to the Operations Manager server and run the following at the command prompt:

c:\> dfm option set dpReaperCleanupMode=Never

This prevents Protection Manager’s reaper from cleaning up any relationships.

4. Issue the following command to relinquish the primary and secondary member:

c:\> dfpm dataset relinquish destination-qtree-name

This marks the SnapMirror relationship as external, and Protection Manager will no longer manage it.

5. Using the NetApp Management Console GUI, remove the primary member from the dataset, then remove the secondary member.

6. On the CLI of the source filer, create a manual snapshot copy by using the following command:

sourcefiler> snap create oldsourcevol common_Snapshot

7. Update the destinations by using the following commands:

sourcefiler> snapmirror update -c common_Snapshot -s common_Snapshot -S sourcefiler:/vol/oldsourcevol/qtree sourcefiler:/vol/newsourcevol/qtree

destinationfiler> snapmirror update -c common_Snapshot -s common_Snapshot -S sourcefiler:/vol/oldsourcevol/qtree destinationfiler:/vol/destinationvol/qtree

8. Quiesce and break the SnapMirror relationships (between the source filer and the destination filer, and between the old source and new source volumes) using the following commands:

destinationfiler> snapmirror quiesce /vol/destinationvol/qtree
destinationfiler> snapmirror break /vol/destinationvol/qtree
sourcefiler> snapmirror quiesce /vol/newsourcevol/qtree
sourcefiler> snapmirror break /vol/newsourcevol/qtree

9. Establish the new snapmirror relationship using the following command on the destination system:

destinationfiler> snapmirror resync -S sourcefiler:/vol/newsourcevol/qtree destinationfiler:/vol/destinationvol/qtree

The new SnapMirror relationship automatically picks the newest common Snapshot copy for replication – in this case, the common_Snapshot copy created in step 6.

10. Verify that the SnapMirror relationship is resynchronizing by using the following command:

destinationfiler> snapmirror status

11. Recreate the shares on the new source volume.
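
For a CIFS share on a 7-mode filer, recreating the share can be as simple as the command below (the share name is made up; adjust the path to your new qtree):

sourcefiler> cifs shares -add qtree_share /vol/newsourcevol/qtree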

12. At this point, on the Protection Manager GUI console, you will see the snapmirror relationship in the External Relationship tab.

13. Create a new dataset with the required policy and schedule. Use the import wizard to import the snapmirror relationship into the new dataset.

14. At the Operations Manager server command prompt, set the reaper cleanup mode back to orphans:

c:\> dfm option set dpReaperCleanupMode=orphans

Please send me a note if you need more information.

Sources

https://library.netapp.com/ecmdocs/ECMP1196991/html/GUID-301E424F-62C5-4C89-9435-F13202A1E4B6.html
https://communities.netapp.com/message/44365

IT Converged Infrastructure

Is converged infrastructure the future? Major technology companies are now offering integrated compute, storage, and network in a box. Leading the pack is the Vblock system by VCE. Vblock consists of hardware and software from Cisco, EMC, and VMware.

Similarly, servers, storage, and network vendors are also offering their own integrated system. NetApp, a storage vendor, is selling FlexPod. A FlexPod combines NetApp storage systems, Cisco Unified Computing System servers, and Cisco Nexus fabric into a single, flexible architecture.

Cisco, a networking company, has been selling x86 Unified Computing System for years and recently bought Whiptail, a high performance storage company, to enhance their unified infrastructure offering. HP, a server company, is offering the POD solution.

These converged infrastructure solutions are not only suited for small or medium-sized data centers; they are also engineered for large-scale, high-performance, and highly reliable data centers. In addition, security, automation, and monitoring are built into the package.

With these solutions, companies do not need to spend time and money architecting and integrating servers, storage, and networks. Most importantly, operations and vendor support will be simplified: there will be only one point of contact for vendor support, and finger-pointing between vendors will be minimized.

Information Security Conference

I recently attended the 2013 (ISC)2 Annual Security Congress, held in Chicago, IL on September 23 to 27. The conference was held in conjunction with the ASIS International security conference. It is one of the premier conferences attended by security professionals from all over the world, and it was a huge success.

I attended the conference primarily to obtain CPE (Continuing Professional Education) credits for my CISSP (Certified Information Systems Security Professional) certification, to learn from experts about the latest technologies and trends in information security, and to network with other information security professionals.

The keynote speeches were informative, entertaining, and inspirational. Steve Wozniak (co-founder of Apple) talked about how he got into the world of computing and argued that hacking – for the sake of learning, inventing, and developing programs – should be fun. The former Prime Minister of Australia, the Hon. John Howard, talked about the qualities of a great leader and the state of the world economy. Mike Ditka (an NFL legend) delivered an inspirational speech on attitude and success.

The sessions on information security varied widely, from governance to technical deep dives on security tools. Hot topics included cloud security, mobile security, hackers, privacy, and end-user awareness. What struck me most was that the reason there are still so many security breaches, despite the advances in technology, is that security is often an afterthought for most companies – defense-in-depth is not properly implemented, programmers write insecure code (for instance, code that does not check for SQL injection), and users are not properly trained on security (such as how to use good passwords and not to click phishing links sent via email).

The world of information security is expanding. As more and more people are using the Internet and more companies are doing business online, the need for security becomes even more important.