http://itknowledgeexchange.techtarget.com/storage-soup/datacore-adds-support-for-logical-volumes-up-to-1-pb/
DataCore kicked off 2010 with updates to its SANSymphony and SANMelody storage virtualization software, adding support for logical volumes up to 1 PB and the Asymmetric Logical Unit Access (ALUA) standard...
DataCore director of product marketing Augie Gonzalez said that as with last year’s 1 TB “mega-cache” support, the logical limit is beyond where most customers will be looking to stretch today. But the previous 2 TB limit had grown impractical for making RAID sets out of the latest 1 TB and 2 TB SATA disks.
“The logical volume expansion and thin provisioning allow users to say, ‘I don’t care how big the volume will be in the future’,” Gonzalez said. “Rather than defining LUNs up front and then having to make changes later, you can immediately set up a large volume and expand the storage with no application or infrastructure changes.”
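The behavior Gonzalez describes can be sketched conceptually. Below is a minimal, hypothetical model of thin provisioning (not DataCore's actual implementation): the volume advertises a huge logical size, but physical extents are only allocated when a block is first written.

```python
class ThinVolume:
    """Hypothetical sketch of a thin-provisioned volume: the logical
    size can be enormous (e.g. 1 PB), but physical extents are only
    allocated lazily, on first write."""

    EXTENT = 1024 * 1024  # 1 MiB allocation granularity (illustrative)

    def __init__(self, logical_bytes):
        self.logical_bytes = logical_bytes
        self.extents = {}  # extent index -> bytearray, allocated lazily

    def write(self, offset, data):
        if offset + len(data) > self.logical_bytes:
            raise ValueError("write beyond logical size")
        while data:
            idx, within = divmod(offset, self.EXTENT)
            chunk = data[: self.EXTENT - within]
            # Allocate the backing extent only when it is first touched.
            ext = self.extents.setdefault(idx, bytearray(self.EXTENT))
            ext[within : within + len(chunk)] = chunk
            offset += len(chunk)
            data = data[len(chunk):]

    def physical_bytes(self):
        # Real capacity consumed: only the extents actually touched.
        return len(self.extents) * self.EXTENT


vol = ThinVolume(logical_bytes=10**15)  # advertise a 1 PB volume
vol.write(0, b"hello")                  # touch a single extent
print(vol.physical_bytes())             # 1048576: one 1 MiB extent backs 1 PB
```

The point of the sketch is the asymmetry Gonzalez highlights: the logical ceiling can be raised to 1 PB without requiring real capacity to be in place on day one.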
A DataCore service provider customer says adding ALUA support will improve management in his storage environment. Joseph Stedler, director of data center engineering for cloud computing and managed IT service provider OS33, said he uses DataCore’s SANSymphony software to host back-end storage for his SMB customers. Right now SANSymphony is running on IBM System x servers in front of IBM DS3400 arrays and Xiotech Emprise 5000 storage devices, mirroring between redundant sets of the tiered hardware. The logical volume expansion will be especially helpful in cutting down on backup administration overhead, Stedler said.
“With two terabyte volumes, we had to present things in 2 terabyte chunks to our Veeam [backup] server,” he said. “With a larger primary volume we could have fewer backup targets to manage.”
Stedler said the addition of ALUA support will be even more important for creating multipath I/O in OS33’s VMware environment. “ALUA solves a major pain for everybody running DataCore with VMware,” he said. “The way VMware understood it before was active-passive only. DataCore was able to do active-active failover but with VMware you’d have to run multipathing with the most recently used path. The new release is fully compliant with ALUA, so VMware can view it as active-active.”
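The difference Stedler describes - I/O pinned to a most-recently-used path versus spread across all active paths - can be illustrated with a small, hypothetical model of path selection. This is a conceptual sketch, not VMware's or DataCore's actual code; the path names and states are illustrative only.

```python
# Hypothetical sketch contrasting MRU path selection (the pre-ALUA,
# active-passive view) with ALUA-aware round-robin across all
# active-optimized paths. Not VMware or DataCore code.

def mru_select(paths, last_used):
    """Most Recently Used: keep sending I/O down the same path until
    it fails, then fail over to another live path."""
    if paths.get(last_used) == "active":
        return last_used
    for name, state in sorted(paths.items()):
        if state == "active":
            return name
    raise RuntimeError("no live path")

def alua_round_robin(paths, counter):
    """ALUA-aware: alternate I/O across every active-optimized path."""
    optimized = [n for n, s in sorted(paths.items()) if s == "active-optimized"]
    if not optimized:
        raise RuntimeError("no active-optimized path")
    return optimized[counter % len(optimized)]

# Pre-ALUA view: the array looks active-passive, so MRU pins I/O to one path.
legacy = {"vmhba1:C0:T0:L0": "active", "vmhba2:C0:T0:L0": "standby"}
print(mru_select(legacy, last_used="vmhba1:C0:T0:L0"))

# ALUA view: both paths report active-optimized, so I/O alternates.
alua = {"vmhba1:C0:T0:L0": "active-optimized",
        "vmhba2:C0:T0:L0": "active-optimized"}
for i in range(4):
    print(alua_round_robin(alua, i))
```

With the legacy view, the second path carries no I/O until the first fails; with both paths reporting active-optimized, load is balanced across them - which is the "active-active" behavior Stedler says the ALUA-compliant release enables.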
Stedler said he’s looking forward to the addition of more granular scripting capabilities for the software in future releases.
Monday, January 11, 2010
Thursday, January 7, 2010
Virtualization Predictions: Desktop and Storage Virtualization are the next "Big Wave"
David Marshall from Infoworld/VMblog has posted a series of interviews on 2010 predictions.
DataCore and 2 DataCore solution providers - Helixstorm and Mirazon Group share their outlook on 2010 trends. Read the full article at:
http://vmblog.com/archive/2009/12/31/desktop-and-storage-virtualization-are-the-next-big-wave.aspx
Desktop and Storage Virtualization are the next "big wave"
DataCore Software's CEO George Teixeira shares his viewpoint on 2010. Plus, representatives from two DataCore solution advisor partners weigh in on what they see coming in the virtualization space in 2010.
George Teixeira - President & CEO of DataCore Software
Prediction: Consolidation was the driver of the first wave of virtualization. In 2010, both storage and desktop virtualization will go mainstream.
Prediction: The growing cost disparity between hypervisors and the traditional SANs and shared storage arrays needed to support virtual infrastructures will slow the pace of virtualization adoption.
Prediction: Microsoft Hyper-V has created another wave - the whole Microsoft world is now embracing virtualization - and this trend will continue to accelerate in 2010.
Prediction: This will be the year of virtual desktop proof of concepts (POCs) and pilots; greatest challenge to success is overcoming the cost of storage and the ability to scale storage effectively.
Prediction: Because virtual servers are starting to become a commodity, getting the storage piece right becomes even more important.
Wednesday, January 6, 2010
Clouds and Virtual Infrastructures - DataCore Solution Provider the Mirazon Group Predicts Virtualization in 2010
David Marshall from Infoworld/VMblog has posted interviews on 2010 predictions.
This is a post from DataCore solution provider Mirazon Group.
Check out the full article:
Desktop and Storage Virtualization are the Next "Big Wave"
http://vmblog.com/archive/2009/12/31/desktop-and-storage-virtualization-are-the-next-big-wave.aspx
Clouds and Virtual Infrastructures
Craig Stein - Systems Architect for The Mirazon Group
Clouds and Virtual Infrastructures (not just virtual servers) that encompass virtual storage and VDI are the major opportunities for 2010:
1. The Cloud Computing concept will become more of a reality as service providers offer easy ways for customers to embrace the portability of their virtual workloads to move their VMs and storage from private clouds to hosted cloud infrastructures and back as needed.
2. Greater adoption of Microsoft Hyper-V R2 in both SMB and Enterprise markets for both development and production environments.
3. As organizations embrace Hyper-V we expect to see more products that support the backup and management of vSphere/ESX to also support the Hyper-V platform.
4. As IT budgets are cut, we expect to see a greater trend towards Storage Virtualization that allows commoditization of storage and a heterogeneous vendor approach - instead of a single vendor solution that locks the customer into their initial storage vendor.
5. We expect the VDI market will slowly yield solutions with lower CAPEX cost of Virtual Desktops. VDI is still very expensive from the Software/Thin Client/Storage (SAN) perspective. The ROI for VDI is currently realized through reduced OPEX costs. OPEX costs will also continue to improve as better management tools appear and the VDI software matures.
6. More and more organizations will realize the value of Virtualization as it relates to Disaster Recovery. Even small businesses are embracing Virtualization as the technology becomes more mainstream - simply for the recoverability.
7. In 2010, we expect the trend to continue where application vendors embrace virtualization of all workloads, including I/O-intensive servers running SQL, Oracle, and Exchange, as the technology is understood, proven, and trusted by the ISVs.
8. For telephone/communication systems, as SIP trunks become more popular in replacing proprietary T1/E1 cards, we expect to see virtualization of voice platforms for the increased uptime and DR that virtualization affords.
Tuesday, January 5, 2010
Virtualizing Desktops and Storage
David Marshall from Infoworld/VMblog has posted interviews on 2010 predictions.
This is a post from DataCore solution provider Helixstorm;
Check out the full article:
Desktop and Storage Virtualization are the Next "Big Wave"
http://vmblog.com/archive/2009/12/31/desktop-and-storage-virtualization-are-the-next-big-wave.aspx
Virtualizing Desktops and Storage
Aaron Schneider - Director of Sales Engineering for Helixstorm, Inc.
Virtual servers and consolidation drove the first wave.
Aggregating compute resources through virtualization is now a black and white cost-savings value proposition. Server virtualization has proven itself by consolidating workloads and reducing hardware, power, cabling, and cooling costs. Many companies have already made the jump into virtualization and have successfully increased their service level agreements (SLAs) and implemented rapid application delivery to deal with ever changing business processes and to better protect their data through business continuity.
So what else can we do with virtualization and clouds?
While server virtualization for consolidation may be somewhat saturated, there are still very large untapped areas where virtualization or cloud computing will make a huge impact. Desktop and storage virtualization will be the next "big wave." There has been plenty of activity, discussion, and buzz around moving desktop computing into the cloud. However, virtualizing the desktop has nowhere near the adoption rate of server virtualization - even though the underlying technologies and concepts are identical. Virtual Desktop Infrastructure (VDI) has historically fallen short in two key areas: user experience delivery and short-term ROI. At the end of the day, there is not a CEO on the planet who will accept a poor user experience on the desktop just to make life easier for IT. On the ROI front, there is a significant up-front investment in adopting desktop virtualization (servers, storage, software, thin clients), which is why some organizations turn to vendors that offer desktops through SaaS (Software as a Service). However, over time, there is a strong ROI over the traditional desktop. In our opinion, the only true way to implement desktop virtualization is to de-couple the operating system from the applications, then mesh the two together in real time and deliver to end users the usability they need. Citrix and VMware have the leading and most stable approaches for these types of implementations right now.
Organizations that have adopted or are looking to adopt server and desktop virtualization need to look hard at the storage layer. Virtualizing the storage layer is critical to a true virtual infrastructure. It gives companies the ability to implement Enterprise features for business continuity, such as synchronous and asynchronous replication to safeguard corporate information, without the "Enterprise" price tag. This is a key topic, since we find that many companies in the SMB market need the Enterprise-level features of a traditional storage solution at a price point that fits their budget. A software approach to storage virtualization, such as that provided by DataCore Software, does just that. It protects your software investment while the value of the hardware continues to depreciate on a daily basis.
The virtualization marketplace is still filled with solutions to help you improve and implement business processes. The impact of virtualization is spreading, and this year it will lead to greater success for desktop and storage virtualization. On the desktop side, new technologies like PCoIP (PC over IP), along with advancements in several connection broker technologies, look to finally solve the end-user experience problem and deliver a seamless, near-native desktop computing platform. Storage virtualization as a software solution is breaking down the remaining hardware constraints and continues to evolve, so that today it has become a more cost-effective alternative to traditional hardware arrays without sacrificing performance or features. The key point is that the technology now exists to build virtual infrastructures and clouds, but it has to fit the budgets. Cost-effective storage virtualization solutions are a critical factor in the further growth of both server and desktop virtualization adoption in the marketplace.
Monday, January 4, 2010
Top News of the Day - DataCore Rocks in 2010 with 1 Petabyte Support; 2 TB max VVol size ... busted!
2 TB max VVol size ... busted!
http://www.datacore.com/forum/thread/785/re-2-tb-max-vvol-size-busted-.aspx#post787
THE TOP NEWS OF THE DAY
DataCore Super-Sizes Virtual Disks (Up to 1PB)
With its latest storage virtualization software
http://www.storagenewsletter.com/news/software/datacore-super-sizes-virtual-disks
DataCore Software responds to market demands, this time by stretching the size of its virtual disks from 2 Terabytes (TBs) to 1 Petabyte (PB).
“Rather than inch up to 4 or 16 TBs as others are considering, DataCore made the strategic design choice to blow the roof off the capacity ceiling with 1 Petabyte LUNs,” commented Augie Gonzalez, Director of Product Marketing, DataCore Software. “But we’re still frugal on the back-end, using thin-provisioning to minimize how much real capacity has to be in place day one.”
Performance-wise, these immense virtual disks benefit from DataCore’s 1 TB-per-node, 64-bit 'mega-caches.' “You can be big, and very fast too,” added Gonzalez.