How did he do that?
Growing up, I watched a little TV. Okay, a lot of TV, as I did not have a DVR or an iPad and a man who would one day occupy the White House as VP had not yet invented the Internet. Of the many shows I watched, MacGyver was one of my favorites. He would take ordinary objects and use them to solve complicated problems in ways no one could have imagined. Of all the things he used, his trusty Swiss Army knife was the most awesome. With all its blades, tools and accessories, it could solve multiple problems at the same time. It was easy to use, did not take up a lot of space and was very cost-effective.
Nytro MegaRAID – the Swiss Army knife of server storage
LSI has its own multi-function, get-yourself-out-of-a-fix workhorse – the Nytro MegaRAID® card, part of the Nytro product family. It combines caching intelligence, RAID protection and flash on a single PCIe® card to accelerate applications, so it can be deployed to solve problems across a broad number of applications.
A feature for every challenge!
The Nytro MegaRAID card is built on the same trusted technology as the MegaRAID cards deployed in datacenters worldwide. That means it is architected and hardened for the enterprise and tested in the datacenter. Its Swiss Army knife-like features include, as I mentioned, onboard flash storage that can be configured to monitor the flow of data from an application to the attached RAID-protected storage, intelligently identify hot (that is, the most frequently accessed) data, and automatically move a copy of that data to the flash storage to accelerate applications. The next time the application needs that data, the information is fetched from flash, not the much slower traditional hard disk drive (HDD) storage.
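To make the caching idea concrete, here is a minimal Python sketch of frequency-based hot-data caching of the kind described above. The class name, threshold and data structures are my own illustrative assumptions, not the card's actual firmware logic:

```python
from collections import Counter

class HotDataCache:
    """Toy model of hot-data caching: count block accesses and serve
    frequently read ("hot") blocks from fast flash instead of the
    slower RAID-protected HDD storage. Purely illustrative."""

    def __init__(self, hot_threshold=3):
        self.access_counts = Counter()  # reads seen per block
        self.flash = {}                 # hot blocks promoted to flash
        self.hot_threshold = hot_threshold

    def read(self, block_id, hdd):
        # Cache hit: serve the copy already promoted to flash.
        if block_id in self.flash:
            return self.flash[block_id], "flash"
        # Cache miss: fetch from the HDD array and count the access.
        data = hdd[block_id]
        self.access_counts[block_id] += 1
        # Promote blocks that cross the "hot" threshold.
        if self.access_counts[block_id] >= self.hot_threshold:
            self.flash[block_id] = data
        return data, "hdd"
```

After a block has been read often enough to be judged hot, subsequent reads come from the flash copy rather than the disk, which is the whole point of the acceleration.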
Hard drives can lead to slowdowns in another way, too, when the mechanics wear out and fail. When they do, your storage (and application) performance can dramatically decrease – in a RAID storage environment, this is called degraded mode. The good news is that the Nytro MegaRAID card stores much of an application’s frequently used data in its intelligent flash-based cache, boosting the performance of a connected HDD in degraded mode by as much as 10x, depending on the configuration. The Swiss Army knife follow-on benefit is that when you replace the failed drive, Nytro MegaRAID speeds RAID storage rebuilds by as much as 4x. RAID rebuilds add to IT admin time, and IT time is money, so that’s money you get to keep in your pocket.
The Nytro MegaRAID card also can be configured so you can use half of its onboard flash as a pair of mirrored boot drives. In big data environments, this mirroring frees up two boot drives for use as data storage to help increase your server storage density (aka available storage capacity), often significantly, while dramatically improving boot time. What’s more, that same flash can be deployed instead as primary storage to complement your secondary HDD storage with higher speeds, providing a superfast repository for key files like virtual desktop infrastructure (VDI) golden images or key database log files.
One MacGyver Swiss Army knife, one Nytro MegaRAID card – both easy-to-use solutions for a number of complex problems.
Tags: application acceleration, big data, data protection, database, flash, flash card, flash-based cache, hard disk drive, HDD, MacGyver, Nytro MegaRAID card, PCIe card, RAID, server storage, Swiss Army knife, VDI, virtual desktop infrastructure
Where did my email go?
This week I was dragged into the virtualized cloud kicking and screaming … well, sort of. LSI has moved me and all my co-workers from our nice, safe local Exchange server to one in the amorphous, mysterious cloud. Scary. My IT department says the new cloud email is a great thing. They are now promising me unlimited email storage. Gone will be the days of harrowing emails telling me I am approaching my storage limit and soon will be unable to send new email.
With cloud email, I can keep everything forever! I am not quite sure that saving mountains of email will be a good thing :-). Other than having to redirect my tablet and smartphone to this new service, update my webmail bookmark and empty my email inbox, there was not much I had to do. So far, so good. I have not experienced any challenges or performance issues. A key reason is flash storage.
To be sure, virtualization is a great tool for improving physical server utilization and flexibility as well as reducing power, cooling and datacenter footprint costs. That’s why the use of virtualization for email, databases and desktops is growing dramatically. But virtualized servers are only as effective as the storage performance that supports them. If, as a datacenter manager, your clients cannot access their application data quickly or boot their virtual desktop in a reasonable time, your company’s productivity and client satisfaction can drop dramatically.
Today, applications most likely run on virtualized servers. The upside of server virtualization is that a company can improve server utilization and run more applications on fewer physical servers. This can reduce power consumption, make more efficient use of datacenter floor space and make it easier to configure servers and deploy applications. The cloud also helps streamline application development, allowing companies to more efficiently and cost effectively test software applications across a broad set of configurations and operating systems.
A heated dispute – storage contention
Once application testing is complete, a virtual server’s configuration and image can be put on a virtual shelf until they are needed again, freeing up memory, processing, storage and other resources on the physical server for new virtual servers with just a few keystrokes. But with virtualization and the cloud there can be downsides, like slow performance – especially storage performance.
When a number of virtual servers all use the same physical storage, they can end up fighting over storage resources, a problem generally known as storage contention. These internecine battles can slow application response to a frustratingly glacial pace and lead to issues like VDI Boot and Login Storm that can stretch the time it takes users to log in to tens of minutes.
Flash helps alleviate slowdowns in storage performance
Here is where flash comes to the rescue. New flash storage solutions are being deployed to help improve virtualized storage performance and alleviate productivity-sapping slowdowns caused by VDI Boot and Login Storm — the crush of end users booting up or logging in within a small window that overwhelms the server with data requests and degrades response times. Flash can be used as primary storage inside servers running virtual machines to dramatically speed storage response time. Flash can also be deployed as an intelligent cache for DAS- or SAN-connected storage and even as an external shared storage pool.
It’s clear that virtualization will require higher storage performance and better, more cost-effective ways to deploy flash storage. But how much flash you need depends on your particular virtualization challenge, configuration and of course budget: while flash storage is extremely fast, it is costlier than disk-based storage. So choosing the right storage acceleration solution – one is LSI® Nytro™ Application Acceleration – can be as important as choosing the right cloud provider for your company’s email.
While my email is now stored in the cloud in Timbuktu, I know the flash storage solutions in that datacenter help keep my mail quickly accessible 24/7 whether I access it from my computer, tablet or smartphone, giving my productivity a boost. I can be assured that every action item I am sent will quickly make it to my inbox and be added to my ever-growing to-do list. Now my next big challenge is to improve my own response performance to those email requests!
August was always an exciting time at my childhood home. We were excited that school was starting in September, and mom was relieved that summer was coming to an end. I remember the annual trips to the local department stores to buy school clothes. It was always exciting to pick out new school clothes and a new winter coat. With only a few stores to choose from, many of us wore similar clothes and coats when classes started.
As consumers, we have far more fashion and store options today. There are specialty stores at the mall, big box outlets, membership stores and specialty online portals. With so many more clothing designers than in years past, retailers are also inundated with fashion choices. The question becomes: How does the fashion chain – from textile suppliers and clothing manufacturers to the retailers themselves – choose what to carry?
They all rely on big data to make critical decisions. Let’s go to the start of the chain: the textile manufacturer. It may analyze previous years’ orders, competitive intelligence, purchasing trend data, and raw material and manufacturing costs. While tracking analytics on one data source is relatively easy, capturing and analyzing multiple data sources can be a tremendous challenge – a point underscored in a 2012 research report from Gartner. In its analysis, Gartner found that big data processing challenges don’t come from the analysis of a single data set or source but rather from the complexity of the interaction between two or more data sets.
“When combining large assets and new asset types, how they relate to each other becomes more complex,” the Gartner report explains. “As more assets are combined, the tempo of record creation and the qualification of the data within use cases becomes more complex.”
The next link in the chain is the clothing companies that create the fashions. They have a much more complex job, using big data to analyze fashion trends and improve their decision-making. Information such as historical sales, weather predictions, demographic data and economic details helps them choose the right colors, sizes and price points for the clothing they make.
Swim Suits and Snow Parkas
This is where we, as consumers, come into the picture. Just as I did many years ago, people still shop for school and winter clothing this time of year. The clothes on the racks at our favorite retailer or from an online catalogue were chosen and ordered 6-9 months ago. Take Kohl’s. The nationwide retailer uses a blend of geographic weather prediction data sources to know where to best sell those snow parkas versus swim suits, economic and competitive data to price them right, demographic data sources to better predict the required sizes and customer demand, and market trends data sources to better forecast the colors and styles that will sell best. The more accurately Kohl’s buyers can predict consumer behavior using big data, the less the retailer will need to discount overstock, and the higher its sales and profit.
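As a toy illustration of blending data sources the way the retail example describes, here is a small Python sketch. The store names, data values, weights and decision rules are all invented for illustration; a real retail forecast would combine far more sources with statistical models rather than simple rules:

```python
# Two hypothetical data sources feeding one stocking decision.
weather_forecast = {"Minneapolis": "cold", "Miami": "warm"}
demographics = {"Minneapolis": 0.42, "Miami": 0.55}  # share of shoppers under 40

def stocking_plan(store, base_units=1000):
    """Pick a product mix from the weather source, then scale the
    order size with the demographic source. Illustrative only."""
    # Rule 1: weather data picks the product.
    product = "snow parkas" if weather_forecast[store] == "cold" else "swim suits"
    # Rule 2: demographic data scales the order quantity.
    units = int(base_units * (1 + demographics[store]))
    return product, units
```

The point of the sketch is Gartner’s observation above: each source alone is easy to analyze, but the decision only emerges from combining them.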
As I stated in my previous blog posts, the Hadoop® architecture is a great tool for efficiently storing and processing the growing amount of data worldwide, but Hadoop is only as good as the processing and storage performance that supports it. As with flu strain and weather predictions, the more data you can quickly and efficiently analyze, the more accurate your prediction. When it comes to weather and flu vaccines, these predictions can help save lives, but in the fashion industry it is all about improving the bottom line.
Whether in fashion, medical, weather or other fields, the use of Hadoop for high levels of speed and accuracy in big data analysis requires computers with application acceleration. One such tool is LSI® Nytro™ Application Acceleration. You can go to TheSmarterWayToFaster™ for more information on the Nytro product family.
Part three of this three-part series continues to examine some of the diverse and potentially life-saving uses of big data in our everyday lives. It also explores how expanded data access and higher processing and storage speed can help optimize big data application performance.
Every year I diligently get in line for my annual flu (or, more technically accurate, “seasonal influenza”) shot. I’m not particularly fond of needles, but I have seen what the flu can do and how many die each year from this seasonal virus.
When you get the flu shot – or, now, the nasal mist – you are trusting a lot of people that what you are taking will actually help protect you. According to the CDC (Centers for Disease Control and Prevention), there are three antigenic types (A, B and C) of influenza virus, and of those three, two cause the seasonal epidemics we suffer through each year.
Not to get too technical, but I learned that the A type is further subtyped by two surface proteins, and the subtypes are given names like H1N1, H3N2 and H5N1. These names can even be updated by year if the virus changes; in 2009, for example, H1N1 became the 2009 H1N1. So where we may just call it H1N1, the World Health Organization has a whole taxonomy for describing a seasonal influenza strain.
This taxonomy includes:
- the antigenic type (A, B or C)
- the host of origin (for viruses isolated from non-human hosts, such as swine)
- the geographical origin
- the strain number
- the year of isolation
- for influenza A viruses, the hemagglutinin (H) and neuraminidase (N) subtype, such as H3N2
As you can see, it can really get complicated quickly. If you would like to go deeper, you can read more about this here. While much of this information seems pretty arcane to the lay reader, you quickly can see that the sheer volume of information collected, stored and analyzed to combat seasonal influenza is a great example of big data.
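As a small illustration of how a structured naming convention like this works, the Python sketch below assembles a WHO-style strain name from its parts: the type, the host of origin (omitted for human isolates), the geographic origin, the strain number, the year and, for influenza A, the subtype. The function name and signature are my own invention:

```python
def strain_name(virus_type, origin, strain_no, year, subtype=None, host=None):
    """Assemble an influenza strain name in the WHO style:
    type / host (if not human) / geographic origin / strain number / year,
    with the H-N subtype in parentheses for influenza A viruses."""
    parts = [virus_type]
    if host:                       # host is conventionally omitted for human isolates
        parts.append(host)
    parts += [origin, str(strain_no), str(year)]
    name = "/".join(parts)
    if subtype:                    # only influenza A carries an H-N subtype
        name += f" ({subtype})"
    return name
```

For example, the 2009 pandemic vaccine strain comes out as `strain_name("A", "California", 7, 2009, subtype="H1N1")`, that is, "A/California/7/2009 (H1N1)".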
In the US, once the CDC sifts through this data – using big data analytics tools – it uses its findings to determine what strains might affect the US and builds a flu shot to combat those strains. During the 2012/2013 season, the predominant virus was influenza A (H3N2), though influenza B viruses and a dash of influenza A (H1N1)pdm09 (pH1N1) also circulated. (See the full report here.)
In addition to identifying dominant viruses, the CDC also uses big data to track their spread and potential effect on the population. Reviewing information from prior outbreaks, population data and even weather patterns, the CDC uses big data analytics to quickly estimate where viruses might hit first, hardest and longest so that a targeted vaccine can be produced in sufficient quantities, in the required timeframe and even for the right geography. The faster and more accurately this can be done, the more people can get this potentially life-saving vaccine before the virus travels to their area.
As I stated in my previous blog post, the Hadoop® architecture is a great tool for efficiently storing and processing the growing amount of data worldwide, but Hadoop is only as good as the processing and storage performance that supports it. As with weather predictions, the more data you can quickly and efficiently analyze, the greater the likelihood of an accurate prediction. When it comes to weather and flu vaccines, these predictions can help save lives. In my final blog post in this series, I will explore how big data helps the fashion industry.
Whether in medical, weather or other fields that leverage big data technologies, the use of Hadoop for high levels of speed and accuracy in big data analysis requires computers with application acceleration. One such tool is LSI® Nytro™ Application Acceleration. You can go to TheSmarterWayToFaster™ for more information on the Nytro product family.
Part two of this three-part series continues to examine some of the diverse and potentially life-saving uses of big data in our everyday lives. It also explores how expanded data access and higher processing and storage speed can help optimize big data application performance.