How to access old HPC-archive data
HPC-archive is closed
The HPC-archive service provided storage space for backups and long-term data storage on the Sisu and Taito supercomputers. On these servers, HPC-archive was accessed with iRODS commands.
As these supercomputers have now been phased out, HPC-archive has also been closed. If HPC-archive contained data that you want to preserve, the data can be made available in the Allas object storage environment. However, you should act now, as the HPC-archive data will not be available after the end of 2020.
Allas access is needed
To get access to your HPC-archive data, you need a CSC computing project that has access to the Allas storage service.
You can use an existing CSC project or establish a new one. You don't need to be the manager of the project that you will use for preserving your HPC-archive data. Note, however, that all members of the Allas project will have access to your data.
You can check your current projects and services in the CSC customer portal. Instructions for creating a project can be found in the CSC documentation.
What to do
To start the migration process, send a request to the CSC Service Desk. The request should include:
- Your CSC user account
- The name of the project that you will use to host your HPC-archive data
Please use your organizational e-mail address when sending the request. Once the request is processed, you will receive information about the Allas bucket that contains your HPC-archive data. This bucket is added to your Allas project, and it will use the storage quota and billing units of your project just like any other data stored in Allas.
Using HPC-archive data in Allas
The name of the bucket where your HPC-archive data has been relocated starts with the string hpca-, followed by a random string. For example: hpca-some_rand_string
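If you are unsure which bucket holds the migrated data, a bucket listing can be filtered for the hpca- prefix. A minimal sketch, where the bucket names are illustrative placeholders and the listing would in practice come from a listing command such as s3cmd ls:

```shell
# Sketch: pick out the migrated HPC-archive bucket from a bucket listing.
# The bucket names below are hypothetical placeholders; in practice the
# listing would come from a command such as "s3cmd ls".
BUCKETS="my-project-data
hpca-some_rand_string
results-2020"

# Migrated HPC-archive buckets always start with the "hpca-" prefix.
HPCA_BUCKET=$(printf '%s\n' "$BUCKETS" | grep '^hpca-')
echo "$HPCA_BUCKET"
```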
The data has been transferred into this bucket using the S3 protocol. In Puhti, Allas is used by default with a different protocol (Swift), and therefore using the hpca- bucket requires some extra steps and settings.
First, when you connect to the Allas area of your project, you should open the connection with both the Swift and S3 protocols. This is done with the command:
allas-conf --mode both
In the case of hpca- buckets, the data can be downloaded with s3cmd:
s3cmd get s3://hpca-some_rand_string/file_name local_file_name
The data can also be downloaded with a-get, but in that case you need to add the option --s3cmd to the command:
a-get --s3cmd hpca-some_rand_string/file_name
If you want to copy data from a hpca- bucket to another bucket, it is recommended that you first download the object to the scratch area of Puhti and then upload the data to Allas with the normal a-put or rclone commands:
a-get --s3cmd hpca-some_rand_string/file_name
a-put file_name -b bucket-to-upload
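For many files, the two-step copy can be scripted. A minimal sketch that only prints the commands it would run, so they can be reviewed before executing; the bucket and file names here are hypothetical placeholders:

```shell
# Sketch: build the two-step migration commands (a-get from the hpca-
# bucket, then a-put to a normal Allas bucket) for a list of objects.
# Bucket and object names below are hypothetical placeholders.
HPCA_BUCKET="hpca-some_rand_string"
TARGET_BUCKET="bucket-to-upload"

CMDS=$(printf '%s\n' data1.tar data2.tar | while read -r f; do
  echo "a-get --s3cmd $HPCA_BUCKET/$f"
  echo "a-put $f -b $TARGET_BUCKET"
done)

# Print the generated commands for review; run them once the names
# have been checked.
echo "$CMDS"
```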
Last edited Fri Sep 4 2020