- Tech Talk 01 Get ready for a new release of GroupWise. Wanna busy search a client's calendar? Go for it. Want more panels on your home view? Track a couple of Web sites in your home view too. Take your pick of these and many more new features in the next release of GroupWise.
- Tech Talk 02 Upgrading from NetWare to the next generation of technology doesn't have to be painful—or costly. Take advantage of all the benefits Open Enterprise Server 2 offers for less and without the pain of retraining your staff. See how this migration path stacks up against moving to the big unknown—Windows Server 2008.
- Tech Talk 03 Try it just once and it'll be the last time you'll want to call support. It's faster, cheaper and unlimited. See how this new support option is going to change the way you want support from Novell.
- Tech Talk 04 Does your enterprise have legacy systems that won't talk to other computers? The hotel industry had that problem, until Novell stepped in. Now, using code that Novell open sourced, the hotel industry can run identity management solutions on its legacy systems! See how you can teach your old systems to talk.
- Tech Talk 05 The word spin can have a negative connotation—unless it's PlateSpin. Read how Novell's acquisition of Toronto-based PlateSpin is going to give your data center several positive benefits. If you want your data center tasks to manage themselves, welcome to the new Novell technology from PlateSpin.
- Tech Talk 06 For beginners, databases can be frightening. But with a little help in building effective forms, you'll be on your way to populating a database that can almost take care of itself. See how in this installment of our OpenOffice.org series, which covers the database application included in the free office suite.
- Tech Talk 07 If you're like most companies, your end users' teams are made up of people across the globe. Yet they want to feel like their teammates are just down the hall. Enter SiteScape. It's now a part of Novell and provides the engine behind the new Novell Teaming + Conferencing products. Find out how this recent acquisition benefits you.
- Proof Point Toll Brothers, the leading builder of luxury homes in the U.S., had issues. One was managing desktops across 300 locations, including construction site trailers across 22 states. Keeping them in standard, working order was quite a problem. See how Novell automated that, increased the security of sensitive financial data, and much more.
- Trend Talk Are you up on your backups? Are you a synthetic backer upper? What about your recovery objectives? How will you recover after the crisis strikes? Learn what types of backup and recovery procedures are available, so when the crisis strikes, you'll be up on your backups and know just how to recover.
- Laura Chappell Analysis Session: TCP Connection Loss
Trend Talk by Amin Marts
Novell Open Enterprise Server, now in its second generation, has a deeper, richer and more capable ecosystem of support surrounding it. A major facet of that ecosystem is backup and recovery. As with any platform that serves as a foundational component of enterprise computing and collaboration, having compatible tools to recover lost files is not a luxury; it's a must-have.
The world of backup and recovery is chock-full of technologies that offer much greater capabilities today than they did a few short years ago. Backing up data incrementally or differentially is no longer the only standard best practice, nor the only way to approach the task. Technologies such as synthetic backup, capacity-optimized storage/data deduplication and continuous data protection are a few of the newer capabilities vendors are offering in this space.
These technologies are especially compelling when paired with Open Enterprise Server. Unlike many technologies focused directly on the knowledge worker, implementing Open Enterprise Server lays the foundation for a comprehensive backup and data recovery plan.
This article provides an overview of the latest backup and recovery technologies available for Open Enterprise Server 2, with particular focus on the new and often misunderstood offerings in the space.
One of the newest and most important technologies in the backup and recovery space is synthetic backup; however, in the strictest sense, the term 'newest' is misleading. Synthetic backup has been on the scene for close to three years. It was introduced by a number of niche data management startups and has most recently been added to the portfolio of tier-one vendors such as Symantec and CommVault.
A major differentiator between this technology and traditional backup methodologies is the ability to create a full backup without accessing the original, online data. Enterprise backups are generally responsible for the safekeeping of terabytes of data. Depending on the available bandwidth between the data being copied and the backup device, a full backup can take upwards of an entire weekend. Many times while this is happening, end users and applications are changing and manipulating the data. Synthetic backups mitigate this challenge by reducing the amount of time needed to complete the full backup.
Borrowing a typical scenario from the real world, incremental backups take place Monday through Thursday, with the full backup starting on Friday. During these incremental backups, data is streamed from the primary data store to a backup medium. Whether the destination is tape or disk, the same rules apply: bandwidth and processing resources are consumed on both the media server and the backup targets. The aggregate amount of data can be relatively small per target, but that changes drastically when a full backup takes place.
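To make the weekday incremental pass concrete, here is a minimal sketch in Python. It is illustrative only, not how any particular vendor's product works: it walks a source tree and copies just the files modified since the previous pass, which is the essence of an incremental backup. The function name and its arguments are hypothetical.

```python
import os
import shutil


def incremental_backup(source_dir, backup_dir, last_backup_time):
    """Copy only files modified since the last backup pass.

    Illustrative sketch: real backup software also records a catalog
    of what was copied and where, so later passes can be combined
    (see synthetic backup) without re-reading the source.
    """
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            # Skip anything untouched since the previous pass.
            if os.path.getmtime(src) > last_backup_time:
                rel = os.path.relpath(src, source_dir)
                dst = os.path.join(backup_dir, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # preserves timestamps
                copied.append(rel)
    return copied
```

Because only changed files cross the wire, the Monday-through-Thursday passes stay small; the expensive step, as described next, is the traditional Friday full backup.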
During a full backup, all of the target's data is streamed to the backup device. The data sent to the backup device is quite different from that of an incremental backup, because it includes everything. Everything—meaning data that has been altered as well as data that hasn't been touched in weeks, months or years. As you might guess, this is a high-touch, resource-intensive process.
Synthetic backup transforms this resource-intensive scenario at the full backup stage. This is accomplished by leveraging the backup file metadata. Instead of streaming data from the backup target to create the full backup, data is merged from the existing incremental datasets. The heavy lifting in this case is done by the media server, which orchestrates the entire process. The metadata, which consists of backup dates, times, data locations, and the like, is then used to create a full backup without requiring data to traverse the network.
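The merge step can be sketched in a few lines. In this simplified model, assumed for illustration, each backup pass produces a catalog mapping file paths to records describing where that version lives on backup media; the media server folds the incremental catalogs into the last full catalog, newest version winning, without ever touching the production servers. A `None` record stands in for a deletion tombstone; all names here are hypothetical.

```python
def synthesize_full(base_catalog, incremental_catalogs):
    """Build a synthetic full-backup catalog on the media server.

    base_catalog: dict of path -> media-location record from the
    last real full backup. incremental_catalogs: list of such dicts,
    ordered oldest to newest. A record of None marks a file deleted
    since the previous pass (a tombstone).
    """
    full = dict(base_catalog)
    for inc in incremental_catalogs:
        for path, record in inc.items():
            if record is None:
                full.pop(path, None)  # file was deleted; drop it
            else:
                full[path] = record   # newer version supersedes older
    return full
```

The result references data already sitting on backup media, which is why no data traverses the network: the "full backup" is synthesized entirely from metadata bookkeeping.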
Organizations adopting this technology have seen vast improvements in backup times. One example is the University of Montreal, which went from shutting down production servers for 12 hours each weekend to conduct backups, to performing a standard full backup only once a year. Synthetic backups pave the way for improved resource management and flexible backup strategies. This winning combination also provides for substantial cost savings in media and power consumption.
Capacity-optimized Storage (COS):
Generating a great deal of buzz within the storage industry is capacity-optimized storage, commonly referred to as data deduplication and aligned with data reduction practices. Capacity-optimized storage was first introduced to the data center in support of existing backup solutions. More recently it has migrated into a primary storage role. Although this Darwinian evolution is intriguing, the focus of this article will remain on the role of capacity-optimized storage as a complementary backup and recovery technology.
Due in part to the “data tsunami” many organizations are currently experiencing, meeting Recovery Time Objectives (RTOs) is an ongoing challenge. Organizations that are forced to replicate recovery data offsite find it especially challenging. Simply put, the overriding objective of capacity-optimized storage is to reduce the total amount of data housed on a storage medium.
Data deduplication, or data reduction, hinges on the identification of patterns to spot redundant data. Redundant data, or data that remains untouched or in its original state, can consume more than 60 percent of an organization's storage capacity. Deduplication technologies simply distinguish data that has not changed from data that has, and then save only the latter. This technology eliminates the redundancy of backing up information that is unchanged.