Apache knows there’s an urgent need for data lifecycle management for big data – and Hadoop now offers Heterogeneous Storage support for different storage types, as well as Archival Storage with Hot, Warm, Cold and other storage policies.
Building on this, Zaloni launched the Arena platform, which allows for fine-grained control of data lifecycle management – at the scale of big data – with the ability to create data retention policies based on whatever criteria make sense for your business, such as age and relevancy. The Zaloni Arena DataOps Platform provides that level of control through the metadata it applies to your data.
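To make the idea concrete, here is a minimal sketch of an age-based retention policy of the kind such a platform automates. The thresholds, action names, and function are hypothetical illustrations, not Arena's actual API:

```python
from datetime import date

# Hypothetical age thresholds (in days) mapped to retention actions,
# ordered from oldest to newest. Real policies could also weigh
# relevancy or other business metadata.
POLICY = [
    (365 * 7, "delete"),   # older than 7 years: delete
    (365 * 2, "archive"),  # older than 2 years: export to cloud storage
    (90,      "cold"),     # older than 90 days: move to a cold tier
    (0,       "hot"),      # recent data stays on the hot tier
]

def retention_action(last_modified: date, today: date) -> str:
    """Return the first policy action whose age threshold the dataset meets."""
    age_days = (today - last_modified).days
    for threshold, action in POLICY:
        if age_days >= threshold:
            return action
    return "hot"
```

For example, a dataset last modified three years ago would fall under the "archive" action, while one modified last month stays "hot".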
Using Zaloni Arena’s user-friendly interface, enterprises can right-size their Hadoop clusters by assigning storage tiers in Hadoop, deleting aged-out data, and exporting data from HDFS to more cost-effective cloud storage, such as Amazon S3. The key to all of this is automation: our data lifecycle management (DLM) capability lets enterprises automate these processes with global or dataset-specific policies, which is critical for successful data lifecycle management in the data lake.
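The tiering, deletion, and export steps described above map to standard Hadoop tooling. A minimal sketch using stock HDFS commands (the paths and bucket name are hypothetical, and the S3 export assumes the hadoop-aws S3A connector is configured):

```shell
# Assign the COLD storage policy to an aging dataset (tiering)
hdfs storagepolicies -setStoragePolicy -path /data/events/2019 -policy COLD

# Verify the policy took effect
hdfs storagepolicies -getStoragePolicy -path /data/events/2019

# Delete data that is past its retention window
hdfs dfs -rm -r -skipTrash /data/events/2015

# Export cold data from HDFS to cost-effective cloud storage
hadoop distcp /data/events/2019 s3a://example-archive-bucket/events/2019
```

A DLM policy engine effectively schedules commands like these across the cluster so they don't have to be run by hand.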
Zaloni Arena provides the following key features:
In partnership with NetApp, Zaloni tested and validated its data lifecycle management capability specifically for NetApp’s E-Series and StorageGRID Webscale hardware configurations.
Solutions like Zaloni Arena give you the control over your data that you’re accustomed to – while also providing better visibility and stronger data governance capabilities. Want to know more? Contact us, and we can discuss your needs.