May 2, 2011



According to IDC, the digital universe grew to 2.3 million petabytes of data last year (1 PB = 1,000 TB). This data is expected to grow 45-fold by 2020. We're all contributing to this stupendous growth, so the big question is: how should you design your IT infrastructure to manage the growing volume of data in your organization? Dell organized an event for its key SME customers last month to discuss the technologies and solutions available to manage data. The highlight of the event was a panel discussion that brought out the issues and challenges faced by Indian SMEs on data management and explored solutions to address them (see photo for panelist details).

Presented here are the key points that were discussed. A key contributor to the growth in data is the increasing number of applications in your data center. As your business scales up, you need to add new applications and upgrade existing ones to handle the growth. Accounting and inventory systems get upgraded to ERP, for instance, and as sales volumes grow, a CRM application comes in to handle incoming leads, provide better service to existing customers, and so on. As the data grows, you need a BI solution to make sense of it all, identify trends, and support business decisions.

How do you scale your IT infrastructure to handle so many applications? You can't continue adding more servers indefinitely. Here's where virtualization comes into play. It allows you to move multiple applications into a virtual environment so that they all run on a single server. But life doesn't get any simpler once you've deployed virtualization. You need solutions to manage a virtualized environment. Virtual machine overload, security issues, and single points of failure are just a few of the things you'll have to manage.
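To make the consolidation idea concrete, here is a minimal sketch of the kind of capacity check a virtualization planner performs before packing several applications onto one host. All figures (host capacity, per-application demand, the 20% headroom margin) are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch: decide whether a set of applications can be
# consolidated onto one virtualization host without overloading it.
# All capacities and demands below are illustrative, not real sizing data.

def can_consolidate(apps, host_cpu_ghz, host_ram_gb, headroom=0.2):
    """Return True if the total demand of all apps fits within the host,
    while keeping a safety margin (headroom) to absorb load spikes."""
    cpu_needed = sum(cpu for cpu, _ in apps)
    ram_needed = sum(ram for _, ram in apps)
    return (cpu_needed <= host_cpu_ghz * (1 - headroom) and
            ram_needed <= host_ram_gb * (1 - headroom))

# (CPU GHz, RAM GB) per application: say, an ERP, a CRM, and a BI workload
apps = [(4.0, 16), (2.0, 8), (6.0, 32)]
print(can_consolidate(apps, host_cpu_ghz=24, host_ram_gb=96))  # True
```

Without the headroom margin, a host running at its nominal capacity has nothing left for peak loads, which is one way "virtual machine overload" shows up in practice.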


There are solutions available to handle all these issues. For instance, one particular solution allows you to migrate an existing application from a physical to a virtual environment, and even migrate it back the other way when needed.

Use Information Lifecycle Management

According to a study done by the University of California in Santa Cruz, 90% of data is never accessed again after being opened for the first time! So imagine all the data lying in your data center, on users' desktops, and in the cloud, most of it hardly ever accessed. The irony is that you can't get rid of it, because you never know when you'll really need it. What you therefore need is a strategy for managing this data, using the right solutions to put the right data on the right storage technology. Frequently accessed data should be placed on high-speed storage, rarely accessed data should be archived, and everything in between should be moved to an appropriate intermediate tier. Plus, of course, there's the work of removing redundant data, handling version control for data that's updated frequently, and so on. That, in essence, is what Information Lifecycle Management aims to address, and today's storage technologies are smart enough to do all of it.
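The tiering rule described above can be sketched in a few lines. This is an illustrative example only: the thresholds (30 days for "hot" data, one year before archiving) and tier names are assumptions, not part of any particular ILM product:

```python
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds for this sketch; real ILM policies are tunable.
HOT_DAYS = 30
ARCHIVE_DAYS = 365

@dataclass
class FileRecord:
    path: str
    last_access: float  # Unix timestamp of the last access

def choose_tier(record: FileRecord, now: Optional[float] = None) -> str:
    """Pick a storage tier based on how long ago the file was last accessed."""
    now = time.time() if now is None else now
    age_days = (now - record.last_access) / 86400
    if age_days <= HOT_DAYS:
        return "high-speed"   # e.g. fast SAN/SSD for frequently used data
    if age_days <= ARCHIVE_DAYS:
        return "nearline"     # mid-tier disk for the "everything in between"
    return "archive"          # tape or cold storage for rarely touched data

# Example: a report untouched for two years belongs on the archive tier
old = FileRecord("reports/2009-q1.pdf", time.time() - 2 * 365 * 86400)
print(choose_tier(old))  # archive
```

Real ILM solutions layer far more on top of this (deduplication, versioning, compliance holds), but at their core they apply access-driven placement rules of exactly this shape.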
