
AWS Announces Cloud Data Warehouse Service Called Amazon Redshift

PCQ Bureau

Las Vegas — Amazon Web Services today announced the limited preview of Amazon Redshift, a fast, fully managed, petabyte-scale data warehouse service in the cloud. The service lets customers speed up query performance when analyzing data sets of virtually any size, using the same SQL-based business intelligence tools they use today. With a few clicks in the AWS Management Console, customers can launch a Redshift cluster, starting with a few hundred gigabytes and scaling to a petabyte or more, for under $1,000 per terabyte per year, one-tenth the price of most data warehousing solutions available to customers today.
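
For teams that prefer to script provisioning rather than click through the console, a minimal sketch using the boto3 SDK for Python is shown below. The cluster identifier, credentials, node type, and node count are illustrative assumptions, not values from the announcement.

    # Minimal sketch (assumptions): launch a Redshift cluster with boto3.
    # The identifier, credentials, node type, and node count below are
    # illustrative placeholders, not values from the announcement.
    import boto3

    redshift = boto3.client("redshift", region_name="us-east-1")

    response = redshift.create_cluster(
        ClusterIdentifier="example-warehouse",
        ClusterType="multi-node",
        NodeType="dw.hs1.xlarge",      # assumed name for the 2 TB node class
        NumberOfNodes=4,
        MasterUsername="admin",
        MasterUserPassword="ChangeMe123!",
    )
    print(response["Cluster"]["ClusterStatus"])   # e.g. "creating"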

Self-managed, on-premises data warehouses require significant time and resources to administer, especially for large datasets. Loading, monitoring, tuning, taking backups, and recovering from faults are complex and time-consuming tasks. The cost of building, maintaining, and growing a traditional data warehouse is also steep: larger companies have resigned themselves to paying it, while smaller companies often find the hardware and software prohibitively expensive, leaving most of them without any data warehousing capability. Amazon Redshift aims to change this.


Amazon Redshift manages all of the work needed to set up, operate, and scale a data warehouse, from provisioning capacity to monitoring and backing up the cluster, to applying patches and upgrades. Scaling a cluster to improve performance or increase capacity is simple and incurs no downtime, while the service continuously monitors the health of the cluster and automatically replaces any failed components. Amazon Redshift is also priced at a fraction of the cost of existing data warehouses, enabling larger companies to substantially reduce their costs and smaller companies to take advantage of the analytic insights that come from using a powerful data warehouse.
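
As a rough illustration of how little operational work this leaves to the customer, the hedged sketch below resizes the same hypothetical cluster and checks its health through the same boto3 API; no hardware is touched.

    # Minimal sketch (assumptions): resize the hypothetical cluster above and
    # inspect its status; Redshift handles provisioning and data redistribution.
    import boto3

    redshift = boto3.client("redshift", region_name="us-east-1")

    # Ask for more nodes; the service adds capacity without manual setup.
    redshift.modify_cluster(
        ClusterIdentifier="example-warehouse",
        NumberOfNodes=8,
    )

    # Check cluster health and the progress of the resize.
    cluster = redshift.describe_clusters(
        ClusterIdentifier="example-warehouse"
    )["Clusters"][0]
    print(cluster["ClusterStatus"])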

“Over the past two years, one of the most frequent requests we've heard from customers is for AWS to build a data warehouse service,” said Raju Gulabani, Vice President of Database Services, AWS.  “Enterprises are tired of paying such high prices for their data warehouses and smaller companies can't afford to analyze the vast amount of data they collect (often throwing away 95% of their data).  This frustrates customers as they know the cloud has made it easier and less expensive than ever to collect, store, and analyze data.  Amazon Redshift not only significantly lowers the cost of a data warehouse, but also makes it easy to analyze large amounts of data very quickly. 

“While actual performance will vary based on each customer's specific query requirements, our internal tests have shown over 10 times performance improvement when compared to standard relational data warehouses. Having the ability to quickly analyze petabytes of data at a low cost changes the game for our customers.”


Amazon Redshift uses a number of techniques, including columnar data storage, advanced compression, and high-performance I/O and networking, to achieve significantly higher performance than traditional databases for data warehousing and analytics workloads. By distributing and parallelizing queries across a cluster of inexpensive nodes, Amazon Redshift makes it easy to obtain high performance without requiring customers to hand-tune queries, maintain indices, or pre-compute results. Amazon Redshift is certified by popular business intelligence tools, including Jaspersoft and MicroStrategy. Over twenty customers, including Flipboard, NASA/JPL, Netflix, and Schumacher Group, are in the Amazon Redshift private beta program.
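
Because Redshift is queried with standard SQL over the PostgreSQL wire protocol, existing drivers and BI tools connect to it directly. The sketch below uses the psycopg2 driver against a hypothetical cluster endpoint and table.

    # Minimal sketch (assumptions): run an ordinary SQL aggregation against a
    # Redshift cluster using a standard PostgreSQL driver. The endpoint,
    # credentials, and "sales" table are hypothetical.
    import psycopg2

    conn = psycopg2.connect(
        host="example-warehouse.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,            # Redshift's default port
        dbname="analytics",
        user="admin",
        password="ChangeMe123!",
    )

    with conn.cursor() as cur:
        # The cluster parallelizes the scan and aggregation across its nodes;
        # no manual indexing or pre-computation is required.
        cur.execute(
            "SELECT region, SUM(revenue) AS total_revenue "
            "FROM sales GROUP BY region ORDER BY total_revenue DESC;"
        )
        for region, total in cur.fetchall():
            print(region, total)

    conn.close()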

Amazon Redshift includes technology components licensed from ParAccel and is available with two underlying node types, offering either 2 terabytes or 16 terabytes of compressed customer data per node. A single cluster can scale up to 100 nodes, and on-demand pricing starts at just $0.85 per hour for a 2-terabyte data warehouse, scaling linearly up to a petabyte and more. Reserved instance pricing lowers the effective price to $0.228 per hour, or under $1,000 per terabyte per year, less than one-tenth the price of comparable technology available to customers today.
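
The per-terabyte figure follows directly from the quoted reserved rate; a quick back-of-the-envelope check, using only the numbers in the announcement:

    # Back-of-the-envelope check of the quoted pricing (figures from the
    # announcement: $0.228/hour reserved rate for a 2 TB node).
    hours_per_year = 24 * 365        # 8,760 hours
    reserved_rate_usd = 0.228        # per node-hour
    node_capacity_tb = 2             # compressed data per node

    annual_cost = reserved_rate_usd * hours_per_year      # about $1,997 per node
    cost_per_tb = annual_cost / node_capacity_tb          # about $999 per TB per year
    print(round(annual_cost), round(cost_per_tb))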


