LSI Blog

Software Defined Data Center

When looking at the near future of data center technology, there are two very important trends to consider. First—the adoption of public and private cloud computing continues to grow more pervasive. Enterprises, software developers, and home users alike are making the transition to cloud-based models for services and storage.

Second—devices, data, and network demand are all projected to grow at explosive rates over the next few years. By 2020, the digital universe – the data we create and copy annually – will reach 44 zettabytes, or 44 trillion gigabytes [1].
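For scale, the arithmetic behind that figure (using decimal SI units, where 1 ZB = 10^21 bytes and 1 GB = 10^9 bytes) works out as:

$$44\ \mathrm{ZB} = 44 \times 10^{21}\ \mathrm{bytes} = 44 \times 10^{12}\ \mathrm{GB} = 44\ \text{trillion GB}$$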

Read more

Mad Scientist in his Flash Analytics Lab

For many years, LSI has recognized how important it is to truly understand the complexities of interfacing with NAND flash memory in order to optimize its performance and lifetime. For that reason, LSI created a group focused on characterizing NAND flash behavior as it interfaces with LSI flash controllers. I recently spoke to LSI’s expert in this area, Bill Hunt, Engineering Director of Flash Analytics at LSI, to better understand what his group produces for LSI and how that translates into better solutions for our customers.

Read more

Ever since SandForce introduced data reduction technology with the DuraWrite™ feature in 2009, some users have been confused about how it works and questioned whether it delivers the benefits we claim. Some even believe there are downsides to using DuraWrite with an SSD. In this blog, I will dispel those misconceptions.
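Before getting into the details, here is a minimal conceptual sketch of why writing less data to the NAND helps. This is my own illustration, not DuraWrite's actual (proprietary) algorithm: it uses generic zlib compression as a stand-in for data reduction, and the sample data is made up. The ratio printed is simply bytes stored per byte the host wrote.

```python
import os
import zlib

def flash_bytes_written(host_data):
    """Return (bytes the host asked to write, bytes actually stored)."""
    # Stand-in for data reduction: any lossless transform that shrinks
    # typical host data before it is committed to the NAND.
    reduced = zlib.compress(host_data, 6)
    return len(host_data), len(reduced)

# Compressible data (logs, databases, office documents) shrinks substantially,
# so far fewer bytes reach the flash than the host actually wrote.
host, stored = flash_bytes_written(b"user_record:0000000000;" * 4096)
print(f"compressible:   stored {stored} of {host} bytes (ratio ~{stored / host:.2f})")

# Already-compressed or encrypted data does not shrink, so the ratio stays
# near 1.0 and the benefit simply disappears.
host, stored = flash_bytes_written(os.urandom(4096 * 23))
print(f"incompressible: stored {stored} of {host} bytes (ratio ~{stored / host:.2f})")
```

Fewer bytes written per host request means less wear on the flash and more spare bandwidth, which is where the performance and endurance claims come from.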

Data reduction technology refresher
Four of my previous blogs cover the many advantages of using data reduction technology like DuraWrite:

A couple of years ago I got a DSLR (digital single-lens reflex) camera.  After using a compact digital camera, the DSLR opened a new world of photography for me. It was great to have the option to shoot six frames per second, use different lenses and fine-tune shutter speed, exposure and other parameters.

Learning to take my photography to a higher level, from auto to manual settings, was quite an experience.  Through research and talking to friends and photographers, I discovered that I needed to learn these fundamentals:

  • Shutter Speed (time the sensor is exposed to light)
  • Aperture (how much light the lens will allow in)
  • ISO (sensor sensitivity to light)

Experimenting with each of these variables was a frustrating test of Murphy’s Law.
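To make the interplay between those settings concrete, here is a small sketch of my own (not from the original post) that computes the standard exposure value, EV = log2(N²/t), for a given aperture and shutter speed. Trading one stop of aperture against one stop of shutter speed leaves the exposure essentially unchanged, which is exactly the kind of experimenting described above.

```python
import math

def exposure_value(f_number, shutter_seconds):
    """Exposure value EV = log2(N^2 / t) for f-number N and shutter time t.

    One EV step (a "stop") halves or doubles the light recorded; raising
    ISO by one stop compensates for one stop less light, at the cost of noise.
    """
    return math.log2(f_number ** 2 / shutter_seconds)

# f/8 at 1/125 s: a typical bright-daylight exposure
print(round(exposure_value(8.0, 1 / 125), 1))   # ~13.0
# Open up one stop (f/5.6) and halve the shutter time (1/250 s):
# the EV, and therefore the exposure, stays essentially the same.
print(round(exposure_value(5.6, 1 / 250), 1))   # ~12.9
```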

Read more

I started working years ago to engage large datacenters, learn what their problems are, and craft solutions for them. It’s taken years, but we engaged them, learned, changed how we thought about storage, and began creating solutions that are now being deployed at scale.

We’ve started to do the same with the Chinese Internet giants. They’re growing at an incredible rate.  They have similar problems, but it’s surprising how different their solution approaches are. Each one is unique. And we’re constantly learning from these guys.

Read more

When implementing an LSI Nytro WarpDrive (NWD) or Nytro MegaRAID (NMR) PCIe flash card in a Linux server, you need to modify quite a few variables to get the best performance out of these cards.

On a Linux server, device assignments can change after a reboot: the PCIe flash card might come up as /dev/sda one time and /dev/sdd, or any other device name, the next. This variability can wreak havoc when you are setting the Linux performance variables. To get around the issue, address the card by its SCSI address so that all of the Linux performance variables persist properly across reboots.
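One general way to do this (a minimal sketch of the approach, not LSI's documented procedure; the SCSI address 0:2:1:0 and the script itself are illustrative) is to resolve the card's current /dev name from its fixed SCSI H:C:T:L address via sysfs, then feed that name to whatever script applies the performance settings. A udev rule or the /dev/disk/by-path symlinks can accomplish the same thing.

```python
#!/usr/bin/env python3
"""Resolve the block device name behind a fixed SCSI H:C:T:L address.

The SCSI address of the PCIe flash card stays the same across reboots even
when its /dev/sdX name does not, so tuning scripts can key off the address.
"""
import os

def device_for_scsi_address(scsi_address):
    """Return e.g. 'sdb' for the block device at the given SCSI address."""
    for dev in os.listdir("/sys/block"):
        device_link = os.path.join("/sys/block", dev, "device")
        if not os.path.exists(device_link):
            continue  # loop, ram and other virtual devices have no SCSI address
        # For SCSI disks the link resolves to .../<host>:<channel>:<target>:<lun>
        if os.path.basename(os.path.realpath(device_link)) == scsi_address:
            return dev
    return None

if __name__ == "__main__":
    dev = device_for_scsi_address("0:2:1:0")  # hypothetical address; confirm with lsscsi
    if dev:
        print("/dev/" + dev)   # e.g. /dev/sdb on this boot, /dev/sdd on the next
    else:
        print("no block device found at that SCSI address")
```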

Read more

My “Size matters: Everything you need to know about SSD form factors” blog in January spawned some interesting questions, a number of them on Z-height.

What is a Z-height anyway?
For a solid state drive (SSD), Z-height describes its thickness, which is generally its smallest dimension. The term is actually redundant: “Z” is one of the three variables – X, Y and Z, synonymous with length, width and height – that describe the measurements of a 3-dimensional object, so Z on its own already denotes the height of an SSD.

Read more

Big data: it’s the buzzword of the year, and it’s generating a lot of attention. An incalculable number of articles fervently repeat the words “variety, velocity and volume,” citing click streams, RFID tags, email, surveillance cameras, Twitter® feeds, Facebook® posts, Flickr® images, blog musings, YouTube® videos, cellular texting, healthcare monitoring …. (gasps for air). We have become a society that sweats buckets of data every day (the latest estimates are approximately 34GB per person every 24 hours), and businesses are scrambling to capture all this information to learn more about us.

Read more

Software-defined datacenters (SDDC) and software-defined storage (SDS) are big movements in the industry right now. Just read the trade press or attend any conference and you’ll see that – it’s a big deal. We’re seeing for-pay vendors providing solutions, as well as strong ecosystems evolving around open source solutions. It’s not surprising why – there is a need for enterprises to deploy large scale compute clusters, and that takes either deep expertise that’s very rare, or orchestration tools that have not existed in the past.

Read more

During the past few years, the deployment of cloud architectures has accelerated to support various consumer and enterprise applications such as email, word processing, enterprise resource planning, customer relationship management and the like. Traditionally, co-located servers, storage and networking moved to the cloud en masse in the form of a service, with overlying applications that have been, and remain, largely insensitive to delay and jitter.

But the fast-emerging next generation of business applications requires much tighter service level agreements (SLAs) from cloud providers.

Read more